VNetPros Twitter Updates
MultiPoint Server Is Your Small Business's Future
Built on Windows Server technology, MultiPoint Server enables multiple local stations to be connected to one computer. Several users can then share one computer at the same time, which enables each user to perform independent work or a group activity.
Users have their own independent and familiar Windows computing experience, using their own monitor, keyboard and mouse directly connected to the host computer. Windows MultiPoint Server 2011 enables more users to access technology at a lower total cost of ownership. It is designed to be simple to manage and use.
MultiPoint Server includes MultiPoint Manager, which helps you, as an administrative user, to monitor and manage MultiPoint Server stations. In contrast to IP-based thin clients, you can connect the MultiPoint Server to each station without connecting it to any other network.
Unlike other solutions on the market, Windows MultiPoint Server 2011 is based on the latest Windows technology and thus can run Windows applications, and support can be obtained through Microsoft or HP.
Broad Client Support
• Introduction of a LAN option provides the ability to connect PCs, thin clients, and network monitors
• Support for both low-cost locally created stations and conventional thin clients
Easy Setup and Management
• Day-to-day use designed for non-IT staff
• Easy setup and system management
• Domain join capability
• Pros across multiple servers
• PowerShell and Hyper-V support for admins
• Rich set of built-in features enables administrators to easily monitor and control stations
• Exercise command and control over the environment by monitoring and interacting with station thumbnails
• Shares the Windows Server Solutions SDK with Small Business Server 2011
• Leverage existing controls or create from scratch with WPF
For up-to-date information and services regarding MultiPoint Server 2011 and how it can benefit your small business office, contact vnetpros.com!
Also referred to as a page file or paging file, a swap file is a file stored on the computer's hard drive that is used as a temporary location for information that is not currently held in the computer's random access memory (RAM). By using a swap file, a computer can use more memory than is physically installed. However, users who are low on hard drive space may notice the computer running slower because the swap file cannot grow in size.
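The relationship is simple arithmetic. Here is a toy sketch (the sizes are made-up examples, not measurements from any real machine):

```python
def usable_virtual_memory_gb(ram_gb, swap_file_gb):
    """Approximate memory the system can commit: physical RAM plus the swap file."""
    return ram_gb + swap_file_gb

def swap_spill_gb(working_set_gb, ram_gb):
    """How much of a workload overflows RAM into the (slower) swap file."""
    return max(0.0, working_set_gb - ram_gb)

# A machine with 2 GB of RAM and a 3 GB swap file can commit about 5 GB,
# but a 3.5 GB working set pushes 1.5 GB out to the slower disk.
print(usable_virtual_memory_gb(2.0, 3.0))  # → 5.0
print(swap_spill_gb(3.5, 2.0))             # → 1.5
```

The second function is why a full hard drive hurts: once the swap file cannot grow, the overflow has nowhere to go.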
Buy a Windows 7 PC, get Windows 8 Pro for $14.99
Attention, valued Clients:
Through the Windows Upgrade Offer, customers who purchase a qualifying Windows 7 PC from June 2, 2012, through January 31, 2013, will receive a promotional offer to download Windows 8 Pro for $14.99 when it becomes available.
• First, your customer buys a qualifying PC running Windows 7 Home Basic, Home Premium, Professional, or Ultimate from June 2, 2012, through January 31, 2013.
• Second, he or she visits the promotional website to register for the promotion for a Windows 8 Upgrade.
• Third, the customer returns when Windows 8 is available for purchase and download.
• Must have Windows 7 PCs preinstalled with Windows 7 Home Basic, Home Premium, Professional, or Ultimate
• Must have been purchased between June 2, 2012, and Jan 31, 2013
The offer must be redeemed prior to Feb. 28, 2013.
Offer valid June 2, 2012, through January 31, 2013. For complete details, visit http://windowsupgradeoffer.com.
Microsoft in 2019
What will the future bring? We can never know. I just know that we can always dream of a better tomorrow and dreaming is free. As an Information Technology Professional I would like to share this video clip I found.
See the possibilities of technology and how it would transform the way we work, play and live.
Windows XP End of Life
XP RAM Upgrade
What I have recommended to my clients, to extend the life of their Windows XP business computers and their current computer investment, is to purchase the fastest RAM their workstation can handle. Because most XP installations are 32-bit, they can physically address at most 4 GB of RAM, a limit imposed by both the hardware and the operating system.
These are the steps I would take to find out how much RAM is installed on my computer: right-click My Computer, choose Properties, and read the installed memory shown on the General tab (or run systeminfo at a command prompt). Then buy the fastest RAM possible that will work with your computer model.
Windows XP has had a very long run. General availability began on December 31, 2001, and the end of sale for preinstalled copies was October 22, 2010. Mainstream support for Windows XP ended back on April 14, 2009, and extended support ends April 8, 2014.
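The 4 GB ceiling on 32-bit systems is simply the size of the address space. A quick check (illustrative arithmetic only):

```python
# A 32-bit pointer can distinguish at most 2**32 byte addresses,
# which is where the 4 GB RAM ceiling on 32-bit Windows XP comes from.
addressable_bytes = 2 ** 32
print(addressable_bytes)               # → 4294967296
print(addressable_bytes // (2 ** 30))  # → 4  (GiB)
```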
5 Tips to Reduce Web Threat Risks
1. Keep your systems patched and up to date.
Keeping systems fully up to date, especially the operating system (Windows XP, Vista, or Windows 7), web browsers (Internet Explorer, Chrome, Firefox), browser plugins, media players, PDF readers, and other applications, can be a tedious, annoying, and time-consuming ongoing task. Unfortunately, hackers are counting on most people to fall far short of what's needed to keep their systems up to date. Making sure this task gets done is one of the ways we are making IT easier for you.
2. Standardize your web software.
If you’ve just read point number 1, you’re probably still thinking that keeping systems fully patched and up to date is an onerous task. What makes this worse is if you don’t know what software is running on your network or you have a variety of individuals using different browsers, plugins and media players.
3. Secure your browsers.
You must familiarize yourself with the plethora of security, privacy and content settings that all browsers have in order to understand the tradeoffs. Some security settings will merely increase the level of prompting—annoying users without adding any tangible security—while others can be important to limiting exploits and threats.
4. Enforce a strong password policy.
The purpose of a password policy should be obvious: If you don’t want everyone to have access to something, you set up passwords to permit access only to authorized users. The purpose of an effective password policy is to keep passwords from being easily guessed or cracked by hackers. Despite this enormous vulnerability in every system, many organizations fail to take this threat seriously.
5. Use an effective web security solution.
A proper web security solution is a vital component of an overall strategy for safeguarding your organization from modern web threats. It will reduce your threat exposure by limiting users’ surfing activity to website categories relevant to their work, or at least help them avoid the dirty dozen categories (adult, gambling, etc.) that are a breeding ground for malware. It will also protect you from trusted sites that you visit daily that may become hijacked at any time to silently spread malware to unsuspecting visitors. Finally, it will also help protect your internet resources from abuse as a result of the exchange of illegal content or bandwidth-sapping streaming media.
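The category-filtering idea described above can be sketched in a few lines. The category map and domains below are invented examples, not a real product's database:

```python
# Categories we never allow, per the "dirty dozen" idea above.
BLOCKED_CATEGORIES = {"adult", "gambling"}

# Hypothetical domain-to-category lookup a real web security
# solution would maintain and keep current.
DOMAIN_CATEGORIES = {
    "example-casino.test": "gambling",
    "example-news.test": "news",
}

def is_allowed(domain):
    """Allow a domain unless its category is on the blocklist."""
    category = DOMAIN_CATEGORIES.get(domain, "uncategorized")
    return category not in BLOCKED_CATEGORIES

print(is_allowed("example-news.test"))    # → True
print(is_allowed("example-casino.test"))  # → False
```

A real solution adds the harder parts: reputation scoring for hijacked trusted sites, and continuous updates to the category data.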
Protecting yourself on the Web
Keeping up-to-date protection on your computer is paramount, so it is worth your time to find the best antivirus software of 2012. There are many fine programs to pick from, and some will meet your needs better than others; installing a capable antivirus program on your desktop is essential. If you don't, you are setting yourself up for a plethora of problems.
Kaspersky Internet Security 2012 is an advanced antivirus program that constantly protects against a wide range of threats. It discovers and reacts to hazards in real time and averts attacks through a two-way firewall. It can also recognize and quarantine malware aimed at your PC over a USB or other local connection, even when you are not online. Kaspersky products have earned striking scores in independent antivirus testing.
The suite is priced rather high at $79.95, although if you shop around on the web you may be able to find it at a discount. It is often said that Apple computers are impervious to virus attacks, yet the possibility is still there. The more Macintosh machines are made, and the more popular they become, the more viruses will be written specifically for them. A Mac running Windows can also get viruses, especially if you have neglected to install protection against them. There are several good antivirus programs for Mac, and one of the best is ProtectMac AntiVirus, which blocks viruses that attack both Windows and Macintosh computers. Another well-liked option, Sophos Anti-Virus for Mac Home Edition, is an excellent free program you can install to keep your Mac protected.
Before buying antivirus software, consider what type of computer user you are. If you spend a lot of time playing games, for instance, you should be extra careful about the threats that arise in that environment. Not every security program is built with gamers in mind, and some gamers turn their antivirus off while they play, leaving themselves vulnerable to cyber threats aimed at them. There is, however, software specifically designed to keep working in the middle of a game without interrupting it.
If you are a parent, you may want a program that offers parental controls. You should also make sure the antivirus software you choose supports your operating system, particularly if your computer is older. In short, do more than simply buy the highest-rated antivirus software; get the one that meets your needs.
Antivirus programs must constantly evolve, because they have to stay ahead of identity thieves, hackers, and other online predators. When deciding what the best antivirus solution for 2012 is, consider the price of the program (if any), whether it provides regular updates, and the level of protection it offers. Keeping your PC and private information safe from harm is something to be serious about.
Hassle Free IT!!
Let's face it: your current IT service provider profits when your systems are down, even though you are paying him to maintain your network. We operate with a different set of values. Our clients enjoy a fixed monthly fee that never fluctuates, regardless of how many service calls or problems arise. We profit only when our customers avoid IT disasters, so out of necessity we go the extra mile to proactively manage, secure, and improve your network to keep IT disasters at bay.
Enterprise Level Solutions for the Small Business Budget
Flat Rate IT Plans
Covers servers, desktops, and network devices
Our helpful and friendly staff is available 24/7
Services By the Hour
Call us today: (858) 633-1800
Or visit us online at http://www.VNetPros.com
Small Business Server 2011
Windows Small Business Server (SBS) is an affordable, all-in-one solution to reduce complexity and increase manageability of server technology in a small business environment.
Your all-in-one network solution, designed and priced for small businesses with up to 75 users. SBS 2011 Standard delivers enterprise-class server technology in an affordable, all-in-one solution. SBS 2011 Standard helps protect your business information from loss by performing automatic daily backups. Additionally, it allows users to be more productive with features such as e-mail, Internet connectivity, internal websites, remote access, and file and printer sharing.
Remote Web Access. Remote Web Access provides a single, simple, consolidated, and highly secure entry point into a small business network. Access files and documents from inside and outside the business through any common web browser.
Desktop Synergies with Windows 7 and Office 2010. By combining Windows 7 and Microsoft Office 2010 with Windows Small Business Server 2011, you will have the IT foundation you need to be more efficient and effective, to easily collaborate with your peers, to work remotely, and to feel confident that all your critical business data is protected.
Mobile Device Support. Integrated setup features configure collaboration services so that you can easily add Windows Phone or other Internet-enabled phones.
Run Business Applications. Supports critical line-of-business applications and runs them on a secure award winning platform.
Tailored to Online Services. Provides a cross-premise solution, allowing small businesses to retain core infrastructure and enables simple, single sign-on experiences with cloud-based services.
10 indispensable iPhone apps for IT administrators
Working while mobile is becoming a requirement for IT administrators. I see two choices: either carry a notebook and data card, or use a smartphone with equivalent capabilities. Apple's iPhone, with the following applications, makes that decision relatively simple.
1: Analytics App ($5.99 US and rated 4+)
This application (Figure A and Figure B) gives immediate access to Google Analytics, allowing prompt feedback on Web site traffic. Using this app is easier and quicker than the actual Analytics Web site. For more information, visit the Inblosam, LLC Web site.
2: LogMeIn Ignition ($29.99 US and rated 4+)
I use LogMeIn extensively, yet I balked at getting this app because of the price. Then I thought why not use my iPhone instead of a notebook and expensive data card? All of a sudden, 30 dollars didn’t seem like much. If you aren’t convinced, LogMeIn offers a free trial of Ignition (Figure C) on its Web site.
3: Network Utility Pro ($0.99 US and rated 4+)
This one offers a lot of capability for one dollar: Ping, TCP/IP port scan, GeoIP lookup, and Whois query. All the utilities work well, with the exception of GeoIP lookup. It never provided the correct location. For more details about Network Utility Pro, refer to Codepacity’s Web site. Figure D and Figure E show the available utilities and the results of a Whois query.
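A TCP port check like the one in Network Utility Pro can be sketched in a few lines of Python. This is a rough illustration of the technique, not the app's actual implementation:

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Refused, unreachable, or timed out: treat the port as closed.
        return False

# A simple scan loops this over a port range, e.g.:
#   open_ports = [p for p in (22, 80, 443) if port_open("192.0.2.1", p)]
```

Real scanners add parallelism and service-banner grabbing, but the connect-and-see core is the same.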
4: Network Ping ($3.99 US and rated 4+)
Network Ping is a series of network tests (Ping, Ping a subnet, Traceroute, and Telnet) ported to the iPhone. I prefer this app over Network Utility Pro when it comes to Pinging. It remembers past queries. For more detailed information, check out MochaSoft’s Web site. You can see the available utilities and the results of a trace route in Figure F and Figure G.
5: RDP Lite (Free and rated 4+)
RDP Lite is a helpful application when dealing with networks containing Windows XP Pro, Vista, or Windows 7 computers. It allows remote access of workstations, solving all sorts of logistics issues. RDP Lite is another application from MochaSoft. Figure H and Figure I show the configuration page and the log-on window.
6: SIO to Go (Free and rated 4+)
Cisco has a project called Security Intelligence Operations. It is a global threat-monitoring network. Zeek Interactive, along with Cisco, developed an iPhone app that delivers SIO early warning intelligence, threats, and Cisco-built solutions. The app also allows you to check the reputation of an e-mail or Web site address (Figure J). Figure K shows current security items of interest.
7: Snap ($1.99 US and rated 4+)
Simple Network Area Prober (SNAP) locates all active devices on the network. It displays both IP and MAC addresses, as well as services of each device found. It’s a great tool for network administrators who need to keep track of devices. 9Bit Labs is responsible for this handy app. Figure L and Figure M show an in-process scan and the results.
8: Speedtest Pro ($0.99 US and not rated yet)
Speedtest Pro is a simple application for evaluating the bandwidth of the iPhone’s 3G, EDGE, or Wi-Fi connection. Several bandwidth apps are available for the iPhone, but few register latency. This app was developed by Xtreme Labs. You can see a completed test and a comparative history in Figure N and Figure O.
9: Telnet ($1.99 US and rated 4+)
Telnet allows the iPhone to connect to standard telnet servers running Linux, BSD, Solaris, OS X, Cisco, or Windows operating systems. I consider this a must-have application. Throughput Inc developed the client and recently released several improvements. Figure P shows the setup page. Figure Q shows an actual connection.
10: WifiTrak ($0.99 US and rated 4+)
WifiTrak scans for available Wi-Fi networks. The app displays a list of networks, prioritized from most usable (open and strongest signal) to least usable (secure and weakest signal). The application was developed by Bitrino, Inc. Figure R and Figure S show the ranking of available networks and specifics for the mjvn network.
Two more iPhone apps
Where are most device labels? On the back, of course. Instead of struggling to see the label, I reach around with my iPhone and take a picture. And being older, I find small print is getting tough to read. That’s where the iMagnify application comes in handy.
There are occasions when I wish I had a flashlight with me. While researching this article, I came across an app called Flashlight. It’s not perfect, but it’s better than the iPhone’s regular display.
Larry Dignan is Editor in Chief of ZDNet and Editorial Director of TechRepublic, a ZDNet sister site.
Why we Patch Computers with Security Updates
90% of security vulnerabilities can be patched. Yet many computers remain at risk because patching is hard. It is also hard to determine which computers need patches and which don't. IT managers, office managers, and staff members often don't know which patches are needed to prevent threats and reduce the attack surface.
VNet Professionals Inc.'s trained technical staff can help you with this daunting task and keep your computers updated. As part of our service to our clients, it is our responsibility to maintain the operating systems on desktops (Windows XP/Windows 7) and servers (Windows 2003, 2008, and 2011); the productivity applications such as Microsoft Office, QuickBooks, and other line-of-business applications; and all utility software such as antivirus and antispam tools. Knowing when these updates are available and when to apply them is important.
We make this one of our responsibilities and are constantly on guard to protect the data that is on these network computers, which are the building blocks of our local economy and your business.
Here are some reasons why this is important:
• One of your most valuable assets is your corporate data
• Unpatched computers pose a great risk to any business
• Common security threats can be minimized by proactive patching
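The bookkeeping behind proactive patching is essentially a set difference: which required updates has a machine not yet installed? A toy sketch (the patch IDs are invented placeholders, not real Microsoft KB numbers):

```python
def missing_patches(installed, required):
    """Return the patch IDs that are required but not yet installed."""
    return set(required) - set(installed)

required = {"KB-0001", "KB-0002", "KB-0003"}
installed = {"KB-0001", "KB-0003"}
print(missing_patches(installed, required))  # → {'KB-0002'}
```

A real patch-management service layers scheduling, testing, and rollback on top, but this gap report is where it starts.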
Our proactive approach minimizes downtime, theft, and data loss.
VNet Professionals Inc. Give us a call with any questions.
6 tips to save time with Microsoft Outlook
Responding in a timely manner is crucial to business, so keep these Microsoft® Office Outlook® tips at your disposal to stay current and organized.
1. Keep a tidy inbox
The most important thing to maintain is also sometimes the most difficult: a clean inbox. Don't be afraid to delete! Having a spring-cleaning session once every few months is good for the soul, as it gets rid of clutter. Deleted items can be permanently removed from your account by clicking Empty; just be sure there's nothing you will need in the future. You can also archive old items with the AutoArchive option. Having organized topic folders is helpful, as long as there aren't so many subdivisions that it gets confusing. You can also store mail in the vault, which will automatically archive messages.
2. Never see “Your mailbox is almost full” again
Folders are a great way to stay organized; however, unless folders are stored on your hard drive, they will still clog your mailbox. Storing them on your hard drive reduces the number of messages in your mailbox, ensuring there is enough space and helping you avoid the dreaded "Your mailbox is almost full" pop-up.
For more information on saving to a Personal Folder file (.pst), visit this Microsoft Office How-to. 
1. On the File menu, point to New, and click Outlook Data File.
2. Select Office Outlook Personal Folders File (.pst), and click OK.
3. In the File name box, type a name for the file, then click OK.
4. In the Name box, type a display name for the .pst folder.
5. Optionally, add a password of up to 15 characters.
Use strong passwords that combine uppercase and lowercase letters, numbers, and symbols. Weak passwords don’t mix these elements. Strong password: Y6dh!et5. Weak password: House27. Passwords should be 8 or more characters in length. A pass phrase that uses 14 or more characters is better. For more information, see Help protect your personal information with strong passwords.
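The rules above are easy to check mechanically. A minimal sketch in Python (illustrative only, not part of Outlook):

```python
import string

def is_strong(password):
    """Check the rules above: 8+ chars mixing upper, lower, digits, and symbols."""
    return (
        len(password) >= 8
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

print(is_strong("Y6dh!et5"))  # → True  (the strong example above)
print(is_strong("House27"))   # → False (too short, no symbol)
```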
It is critical that you remember your password. If you forget your password, Microsoft cannot retrieve it. Store the passwords that you write down in a secure place away from the information that they help protect.
If you select the Save this password in your password list check box, make a note of the password in case you need to open the .pst file on another computer. Select this check box only if your Microsoft Windows user account is password-protected and no one else has access to your computer account.
Important Neither Microsoft, your Internet service provider (ISP), nor your mail administrator has access to your password. No one can open or recover the contents of the .pst file if you forget the password.
The name of the folder that is associated with the data file appears in the Folder List. To view the Folder List, on the Go menu, click Folder List. By default, the folder will be called Personal Folders.
You can also click "Store in Vault" to select which items you would like to keep in the cloud, in case you ever need them again.
3. Create a new message shortcut
For those of you who email frequently, setting up pre-addressed email templates and storing them in a shortcut folder on your desktop will cut out a lot of excess typing (and undoubtedly lower your risk of carpal tunnel syndrome). To create this shortcut, right-click on your desktop:
In ‘Location of Item’ type ‘mailto:’ followed by the recipient’s email address (all with no spaces)
After clicking ‘Next,’ you will be able to name your new shortcut folder
Once you are finished, you will have a pre-addressed email ready to go, every time you click the icon.
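The "mailto:" target from the steps above is just a string; a common extension is to pre-fill the subject line as well, which requires percent-encoding. A small sketch (the address is a placeholder):

```python
from urllib.parse import quote

def mailto_target(address, subject=""):
    """Build the 'mailto:' string used as the shortcut's item location."""
    target = f"mailto:{address}"
    if subject:
        # Spaces and punctuation in the subject must be percent-encoded.
        target += f"?subject={quote(subject)}"
    return target

print(mailto_target("someone@example.com"))
# → mailto:someone@example.com
print(mailto_target("someone@example.com", "Weekly report"))
# → mailto:someone@example.com?subject=Weekly%20report
```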
4. Out of Office replies are key
Out of Office replies will keep your colleagues and contacts from spending precious time wondering if and why you are avoiding them. While you still have the ability to check your email remotely, having Out of Office enabled will allow others to know that you may not receive their email until you return. It’s a good idea to include emergency contact information so those who need to get in touch with someone can do so easily. As long as your server runs on Microsoft Exchange, setting up Out of Office is simple:
Select Out of Office Assistant
Check Send Out of Office Replies, select date range, and type message
For more details, click here.
5. Spell checking is the cardinal rule of anything ever written
It is a tool that makes us wonder how we ever got along without it. There are simple ways to check and correct spelling in any e-mail. If you right click on a misspelled word, other spelling options for that word will appear, allowing you to choose one. To spell check an entire email, simply press F7 on your keyboard and you will be able to check the entire document for both spelling and grammar.
6. Know keyboard shortcuts
Keyboard shortcuts are convenient to quickly check, send, and save emails. Some of the most commonly used shortcuts (+ means ‘in combination with’; not the + ‘addition’ key) are:
CTRL + D Delete
CTRL + R Reply
CTRL + F Forward
CTRL + N New Message
CTRL + P Print
CTRL + C Copy
CTRL + V Paste
CTRL + X Cut
ALT + S Send
F7 Spell Check
F9 Check for Mail
F12 Save Document As
Up arrow Previous Message
Down arrow Next Message
For a full list of keyboard shortcuts for Outlook, click here.
Outlook tips and tricks allow you to stay on top of things, whether you’re in the office or on the go. Both standard and customizable shortcuts will make your life a lot easier when it comes to interacting efficiently via e-mail. The better you are at communicating quickly with your colleagues and contacts, the better they will be at communicating with you.
Hard Drives hard to find?
Hard disk drive supply shortages in the wake of Thailand flooding will continue to affect consumers, computer system manufacturers and corporate IT shops into 2013, according to market research firm IDC.
Because of the shortages, hard drive prices have skyrocketed over the past month, in some cases as much as 100%.
Despite concerns about rising HDD costs, there are indications that prices are starting to settle down.
According to Infoworld, the Camelegg chart, an HDD price-tracking site that follows prices at Newegg, showed the Western Digital 2TB Caviar Green (WD20EARS) hitting a low of $69.99 just before the flood. A month later, on Nov. 10, it had soared to $249.99, an increase of roughly 257%. Today the drive sells at Newegg for $162.99.
However, not all prices are on the downswing. For example, on PriceGrabber.com, the price of a Seagate Barracuda 1TB 7,200-rpm drive has climbed steadily from an average of $140 in late October to $192 today.
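The price jumps quoted above are simple percent-change arithmetic:

```python
def percent_increase(old, new):
    """Percent change from an old price to a new price."""
    return (new - old) / old * 100

# The WD drive's jump from $69.99 to $249.99:
print(round(percent_increase(69.99, 249.99)))  # → 257
# The Seagate drive's climb from $140 to $192:
print(round(percent_increase(140, 192)))       # → 37
```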
VNet Professionals Inc. has them and can deliver next day.
U.S. Small and Medium Businesses Intend to Purchase 3.6 Million Ultrabooks
Techaisle's recent study on Ultrabook purchase intentions reveals that at least 3.6 million Ultrabooks will be purchased by US SMBs in 2012, accounting for 1 in 5 PCs shipped to SMBs. While only 23 percent of SMBs are aware of Ultrabooks, 65 percent of those aware have shown intent to purchase. With better business functionality than tablets, declining prices, and increased marketing from Intel and its OEM partners, SMB Ultrabook shipments could jump to as high as 7 million, as shown in the study. With increased mobility, the size and weight of mobile PCs have become important factors for road warriors who want to be able to work from anywhere and everywhere they go.
While Ultrabooks are considered more stylish and cool than other form factors, including tablets, pricing is an important issue for SMBs. Nevertheless, Ultrabooks have created enough excitement among SMBs, and combined with Windows 8 they make a compelling purchase.
Among the important features of Ultrabooks are long battery life, light weight, built-in security features, the ability to run Windows 8, and fast boot times. Once SMBs begin to use Ultrabooks, they will also find the instant-on, always-on capability a strong feature.
Looking forward, nearly 50 percent of SMBs have expressed their desire to purchase Ultrabooks instead of notebooks. When asked about comparing Ultrabooks with tablets in terms of mobility and performance, there was almost an equal split.
Tavishi Agrawal of Techaisle says, “Surprisingly, 70 percent of SMBs felt that Ultrabooks are better than the MacBook Air. While the MacBook Air may have created the initial buzz, most SMBs feel tied to the Windows platform and are also enticed by Ultrabooks' lower starting prices.”
10 ways to future-proof your business
Knowing what your business will look like in five or ten years is important, but your long-term goals should never detract from your short-term objectives. If you don't have specific, measurable, action-oriented and, most importantly, realistic short-term objectives in place, your business may not have much of a future. It is important to recognize this because the marketplace is always changing, especially in our industry, Information Technology.
One technique that businesses can utilize when looking to the future is called “flash foresight.” This way of thinking allows businesses to look forward and unearth previously invisible opportunities, as well as transform their findings into a model that they can use to solve any problems. By knowing what to expect down the road, businesses can future-proof their technology and staffing needs accordingly, ensuring they’re prepared for any obstacles they may encounter.
Here are ten things you should consider when preparing to future-proof your business:
1. Outline your needs
Take a look at your business's main functions and growth plans for the next few years. Doing so will reveal what your needs are and how your current technological priorities line up with them. Most of the time this is dictated by line-of-business applications, such as accounting, inventory, or a process-driven application.
2. Use hard trends to see what’s coming
Seeing trends before they happen can be invaluable to your business. For instance, VNet Professionals foresaw the accelerating decline in break-fix support among businesses and successfully changed its support model to a monthly flat-fee model; using that knowledge to create options for our clients has been invaluable and has kept us in the lead. On the other hand, many of our competitors did not recognize the decline in companies' IT budgets, or that prolonged exposure to the recession would stretch the longevity of current computer systems for several more years, diminishing the need for services and replacement projects. This failure to recognize such trends over the past few years caused IT companies in the San Diego area to go out of business; I personally know of six IT firms that failed during the recession.
Being able to differentiate between cyclical market changes (stock market, gas prices) and linear changes (new business and vertical market growth), and hard trends (aging baby boomers) and soft trends (not enough doctors to treat aging baby boomers), will help your company make accurate predictions.
3. Past strategies may fail to engage new customers
Has your business been around for more than five years? Do you still rely on print campaigns, direct mail, or newspaper and magazine advertising? Are you getting the same return on investment from these strategies? If so, it might be time to try something new. Don’t hesitate to put emerging technology like social media and mobile applications to use.
4. Use cloud-based services
Rather than purchasing expensive hardware and software that quickly becomes outdated, investigate cloud-based technology. By utilizing cloud computing, your company can get rid of bulky on-site servers and have virtual access to all types of cloud services. Employees can even access important files while on the road by using any computer or smartphone with an Internet connection. We believe in the power of the cloud.
5. Take advantage of new technology
Whether it's a new PC or a smartphone, what you buy today can technically be obsolete tomorrow. For a long time businesses have simply accepted this, but it is possible to avoid it altogether. Our way to counteract this expensive, constant change is to take ownership of your business IT needs and address them with an office-in-a-box approach, incorporating all the major business productivity tools you need for the next three years, for a fraction of the cost of replacing your current old PCs.
6. Don’t depend on one part of your business for complete success
What worked for you last year might not work for you this year. Do you have a service or product that has been performing well over the last three to six months? Perhaps it's time to shift your focus to that service or product and see what it can do for your business.
7. Go against the competition
Have you ever stopped to look at what your competition is doing? Have you ever thought about doing the exact opposite? Surprisingly, sometimes that’s the best thing to do. As a matter of fact, that is what we have done to stay ahead.
8. Maintain service contracts
Think of your IT systems as the engine that runs your business. If you don’t maintain the engine, the business will stop running smoothly. That’s why it’s important to maintain all of your service contracts that cover your hardware, software, and peripheral devices. Oftentimes, these contracts can be a challenge to keep updated; we take the guesswork out of it, making it easy for businesses to reduce business disruption. Let us keep your business engine running smoothly with our 24-hour support services.
9. Ease into new strategies or marketing platforms
Are you implementing a strategy or service that is no longer making you money? Don’t do it anymore. The money and time you invest in something that isn’t producing results can be better spent elsewhere. Consider investing that time and money into an emerging platform such as mobile advertising or inbound marketing and see what happens. For years I have said it is easier for clients to find you than for you to find clients.
10. Undertake constant research and remain vigilant
Businesses are always being watched and copied by others who are interested in improving upon your ideas and techniques. Your business is not immune from this. It’s just a matter of time until someone with a cheaper or better product hits the market, and the only way to protect yourself is to remain constantly proactive. In our IT industry (our business), as in every business, sooner or later you need to innovate and defend. That is an expensive lesson I have learned across the four IT firms I have owned here in San Diego, CA.
Understanding what long-term goals mean to your business on a daily basis will help you establish your short-term objectives. Your company’s goals will only be effective if you have a clear vision of what you want to achieve and how you want to do it. Let your business dictate the needs: if your business demands something that makes it more productive, saves you money, or makes you money, then you have a legitimate necessity.
Give us a call for a free consultation if you think there is room to improve your business in any way.
VNet Professionals believes in Cloud Power
VNet Professionals Inc. believes in Cloud Power, and it’s going to change the way you do business. It’s also going to change your definition of power. Cloud Power gives you the power to think big and to help your business adjust to its growing demands or shrinking needs by doing more with fewer resources, as many thousands of San Diego small businesses already do.
Adjusting to ever-changing economic swings and new methodologies of doing business: now as never before, you have the power to grow or shrink and not suffer for either. The power to do more with less is Cloud Power. Cloud Power means having the most comprehensive business applications and delivering them instantly to anyone in your organization, whether they are at your office, across town, or in another part of the world. This is the power of the cloud, and it is available for your business TODAY, with familiar tools that are simply more expansive, more accessible, more compatible, and more user-friendly. That’s Cloud Power. Call us for more information.
Unlimited Online Backup
We help our clients protect their data without paying oppressive per-GB fees to an online backup provider. How do we do it? We have partnered with a revolutionary online solution that leverages the power of the Cooperative Storage Cloud, a decentralized, distributed cloud storage solution. For each system backing up data to the cloud, we contribute an equal amount of local storage from our data center for use by the cooperative. In short, we are part of the cloud.
How It Works:
Step 1: Data from your local backup system are encrypted on your computer using federally certified 256-bit AES algorithms. Data remain encrypted at all times.
Step 2: The encrypted data are divided into redundant fragments using industry standard Reed-Solomon encoding for robust durability and high availability.
Step 3: All data files are automatically mirrored into the storage cloud. The system is truly set-and-forget; this lower-cost solution eliminates the need for user intervention and adds offsite protection to your onsite backup. The fragmented data are randomly distributed to multiple destination nodes in the network so that no single node holds all the data, eliminating any single point of failure.
Step 4: Data can be retrieved and/or restored at any time. Data are sent and retrieved in parallel to maximize the speed of backup and restore. We can restore your failed server in 90 minutes or less, eliminating long periods of costly downtime and lost productivity.
Step 5: Data are decrypted and available for access on your local system, as there will always be a local copy at each of our clients’ sites.
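The encrypt-then-fragment pipeline in Steps 1 through 3 can be sketched in a few lines of Python. This is an illustration only, assuming nothing about the vendor’s actual implementation: a keyed keystream stands in for the AES-256 cipher, and a single XOR parity fragment stands in for Reed-Solomon encoding, which generalizes the same idea to survive multiple lost fragments.

```python
import hashlib

def keystream_encrypt(data: bytes, key: bytes) -> bytes:
    """Stand-in for the 256-bit AES step (Step 1).

    A keyed SHA-256 counter keystream XORed over the data; applying it
    twice with the same key decrypts. A real deployment would use an
    actual AES-256 cipher, but the shape of the step is the same.
    """
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

def fragment_with_parity(data: bytes, k: int = 4):
    """Stand-in for the Reed-Solomon step (Step 2).

    Splits the encrypted data into k equal fragments plus one XOR
    parity fragment, so any single lost fragment can be rebuilt from
    the parity and the survivors.
    """
    pad = (-len(data)) % k
    padded = data + b"\x00" * pad
    size = len(padded) // k
    frags = [padded[i * size:(i + 1) * size] for i in range(k)]
    parity = frags[0]
    for f in frags[1:]:
        parity = bytes(a ^ b for a, b in zip(parity, f))
    return frags, parity, pad

def recover_fragment(frags, parity, missing):
    """Rebuild one missing fragment (a lost node) from parity + survivors."""
    rebuilt = parity
    for i, f in enumerate(frags):
        if i != missing:
            rebuilt = bytes(a ^ b for a, b in zip(rebuilt, f))
    return rebuilt
```

Because XOR is its own inverse, running `keystream_encrypt` a second time with the same key decrypts, and `recover_fragment` rebuilds any one lost fragment, which is why losing a single storage node (Step 3) does not lose any data.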
Intel Cloud Computing 2015 Vision
Where Will Your Cloud Take You?
Content, opinions, and approaches on cloud computing infrastructure abound. Additionally, there may be varying visions, frameworks, and definitions of what a cloud end state might look like or how we can collectively get there.
Undeniably, cloud computing is an important transition, a paradigm shift in how IT services are delivered—one that has broad impact and can present significant challenges and opportunities. Cloud computing represents a transformation in the design, development, and deployment of next-generation IT services based on flexible, pay-as-you-go business models requiring highly efficient and scalable infrastructure. In a cloud computing environment, services and data reside in shared, dynamically scalable resource pools, often virtualized.
Today, more and more data centers find themselves facing real limits, whether based on lack of power, lack of room, lack of server capacity, or lack of network bandwidth. Expanding traditional infrastructure to meet these challenges quickly uncovers multiple inherent inflexibilities. The resulting complexity breeds cost, deployment risk, and operational risk.
Cloud computing offers the promise of increased agility, reduced costs, greater innovation, and improved TCO. The benefits of cloud computing are best realized through open, interoperable, multi-vendor solutions.
Intel Cloud 2015 Vision
Intel’s Cloud 2015 vision represents our view on the data center transformation underway, driven by the rapid growth in users, data, and services and in the range of connected devices across the globe. IT is already facing significant challenges around space, power, and costs, among others, and this growth of users, data, and devices is placing new requirements and demands on the data center. A new class of solutions is emerging to address the evolution of the data center. Intel’s vision is cloud computing that is federated, automated, and client-aware, and that is built using open, interoperable, multi-vendor solutions to truly realize the cloud’s promise.
• Federated: Communications, data, and services can move easily within and across cloud computing providers.
• Automated: Cloud computing services and resources can be specified, located, and securely provisioned on demand and with zero human interaction.
• Client-aware: Cloud computing services adapt seamlessly to the end user’s device, regardless of the type of device being used.
Merry Christmas 2010!
The entire VNet Professionals team would like to wish you and your family a Merry Christmas and Happy New Year! We also want to thank you for your loyalty and allowing us to continue being your Information Technology experts of choice.
In celebration of the holiday season, VNet Professionals will be closed the afternoon of Wednesday, December 22nd, all day Friday, December 24th, and Friday, December 31st!
We truly treasure your business!
All the best to you and a prosperous 2011,
Antonio de la Cerda, President/CEO
Zarita de la Cerda, Business Development Manager
Sheila Biddle, Administrative Assistant
Brandon Contreras, Account Manager
Ben Moore, Network Systems Engineer
Tim Harracker, Network Systems Engineer
Jeremiah Narty, PC Support Specialist
Trick or treat? Jokingly or seriously, Halloween can Damage your PC
Computer pranks with applications that simulate a Trojan infection are invading the Web
“Paranormal Activity 2” and “Friday the 13th” used in BlackHat SEO attacks to download malware
Spam aimed at getting clicks and personal data using Halloween icons as bait is being widely used by attackers
As Halloween approaches, applications, fake websites, spam and Trojans all put on a disguise to try to trick users. PandaLabs, the anti-malware laboratory of Panda Security, The Cloud Security Company, has been detecting attacks like these since August. However, these have intensified over the last few days, and we are seeing old specimens ‘coming to life’, new strains, and fake applications that only attempt to scare users a little.
Halloween pranks to spread terror…
Even though computer pranks are nothing new, they get massively distributed in the days leading up to Halloween in order to terrorize users. These applications are actually harmless, as they really do not contain any malware or Trojans.
They usually arrive at the targeted computer from one of the victim’s contacts as a Halloween video file or an online greetings card via email, social networks, etc. However, once you download and install them, they show a series of messages and screens informing you that you have been infected by a Trojan.
On other occasions, it is a flash movie that simulates the deletion of all contents on the computer’s hard disk, while a spooky skull is displayed on the screen. The website that distributes this prank offers a video with instructions to configure the movie in order to make it even more real and scary.
In reality, these are just computer virus hoaxes, as neither have you been infected by any malware nor has your hard disk been formatted. However, there is no doubt that users will be really scared to see their computer almost destroyed!
PC User Scams to Watch Out For
Scam #1: Your computer is infected! The biggest criminal enterprise is the rogue antivirus product. It tries to convince you that your computer is infected so you hand over money for “antivirus protection” - which is not actually protection at all. The minute you see a fake alert, stop everything you’re doing, kill the browser, and perform a full scan with the legitimate antivirus product of your choice. And give us a call immediately for support at (858) 633-1800 so we can verify and help remove the offending virus or malware from your computer. Avoid downtime, loss of data and productivity.
Scam #2: Check out this cool link! Your friend’s email or Facebook account is hijacked, and you receive a brief message with a short URL to watch a video or check out something equally “cool.” The link actually leads to a malicious page with a malware download. Most short-link services have a feature that lets you preview where the short link will go; use it. If you’ve never heard of the website, DO NOT GO THERE. A link such as http:\\www.thr67mn.bu$enew0rk.ur/r/?ZXU=136677&Z29798 is very suspect, and you should not engage by clicking on it. Instead, check the true destination domain against a reputation service, such as Webroot’s BrightCloud. And don’t be the first one among your friends to click a link. NO LINK IS THAT IMPORTANT.
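The "check the true destination domain" advice above can be sketched as a small Python helper. The function names here are invented for this illustration, and a production checker would consult the Public Suffix List rather than just taking the last two host labels.

```python
from urllib.parse import urlparse

def registered_domain(url: str) -> str:
    # Reduce a URL's host to its last two labels, e.g.
    # "www.facebook.com" -> "facebook.com". A real checker would use
    # the Public Suffix List, since two labels are not enough for
    # hosts under suffixes like ".co.uk".
    host = (urlparse(url).hostname or "").lower()
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def claims_match(claimed_domain: str, link_url: str) -> bool:
    # True only when the link really lands on the domain the
    # message claims to come from.
    return registered_domain(link_url) == claimed_domain.lower()
```

Note how a hijacked-account link like `facebook.com.evil.example` fails the check even though "facebook.com" appears in the hostname, which is exactly the trick these scams rely on.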
Scam #3: John Doe wants to be your friend. In this one, the scammers usually duplicate the message format of popular social network sites, but instead of leading to a friend request, the link takes you to a malicious page. To avoid this one, without clicking anything, move the mouse over the link in your email message, then look at the status bar to see exactly where the link leads. If the message claims to come from one company but the URL points to a domain you’ve never heard of (as in my example in Scam #2), DON’T CLICK THE LINK. I suggest you delete the email message and then empty the trash folder.
No more Windows XP on new PCs
As TechFlash’s Todd Bishop reminded us, as of today, October 22, PC makers are no longer allowed by Microsoft to preload Windows XP on new PCs.
Netbooks were the last category of PCs on which Microsoft was still allowing XP preloads at this point. Back in April 2008, Microsoft told OEMs that October 22, 2010, would be the day that no more XP Home would be permitted to be preinstalled on new netbooks.
Update: XP preloads are done, but XP downgrades are not, by the way. Best any of us Microsoft watchers can tell, it looks like XP downgrades will be allowed up until 2015. (Microsoft won’t confirm or deny that date.)
Not so coincidentally, today also is the one-year anniversary of the launch of Windows 7, the primary version of Windows which Microsoft is encouraging PC makers to preload on not just PCs, but also the new crop of slates that are coming out. (Hewlett Packard released its long-awaited Windows 7 slate on October 21 — the one that looked at the start of the year that it might be a real iPad competitor, but ended up as a business tablet.)
Microsoft officials said yesterday that in its first year of availability, the company has sold 240 million licenses of Windows 7. Company execs are playing up the new versions of Microsoft’s Windows Live family of add-on services, a new promotional site for Windows 7 applications and hardware (known as Product Scout) and a new Games for Windows Marketplace portal as their Windows 7 updates for this holiday season.
Speaking of Windows Live, I’ve gotten notes from a few readers who are not happy that Microsoft has decided to make the new Windows Live Essentials 2011 bundle something that it is delivering via its Windows Update service. Readers said they consider things like Windows Live Movie Maker, Windows Live Mesh, Windows Live Mail, Windows Live Photo Gallery and the other elements of the suite as nice-to-have add-ons — not something that should be pushed to them via Microsoft’s service which is used primarily to deliver security-focused updates.
But Microsoft is doing just that. Starting October 19, Windows Vista and Windows 7 users who use Windows Update are being offered the Windows Live Essentials 2011 as a “Recommended Update” if they already have any of the included Windows Live software programs installed. Windows Update users who don’t have any of the Windows Live Essentials programs installed on their computers, will also see the update, but it will be marked as “Optional.”
Windows 7, one year later: How’s Microsoft doing? And what’s next?
A year ago today, I was in New York City at the official launch of Windows 7. After a long public beta, and with the released code widely available months earlier, there wasn’t much left to unveil at that point, except for an impressive collection of PCs from OEM partners designed for the new operating system. Most of the Microsoft employees I talked to that day seemed relaxed and genuinely confident. A year later, that confidence is still there. Windows 7 is still selling like gangbusters and the public seems pleased. Back in August, I said: “Windows 7 has been a quiet success, maybe even a phenomenon.” That’s still true.
In my original review, I called Windows 7 “as close to an essential upgrade as I have ever seen,” and I predicted that it would improve with age. A year later, I can already see many of those improvements.
From the standpoint of stability and reliability, Windows 7 has exceeded expectations. The hardware ecosystem was ready, after having been burned badly by Vista, and the Windows Core team did a good job of responding to issues in Windows Vista and Windows Server 2008. With this release, Microsoft might have finally silenced the “Never buy till the first service pack” skeptics. Windows Vista Service Pack 1 was released almost exactly a year after Vista’s consumer launch, and it was desperately needed. Microsoft says it doesn’t plan to finish Windows 7 SP1 until sometime in the first half of next year. That doesn’t seem to bother customers, who have been buying Windows 7 at a rate of 657,000 copies a day over the past year.
One of the biggest under-the-radar improvements to Windows 7 in the past year is the release of Windows Live Essentials 2011. Some reviewers have grumbled about design decisions Microsoft made with the apps in this collection—especially the changes to Messenger—but there’s no question these are full-featured programs, not wimpy starter editions. Photo Gallery is particularly impressive with its extensive set of features for importing, managing, editing, and sharing photos. I don’t think it’s any accident that Apple spent the lion’s share of its time this week on detailed demos of its competing apps in iLife ‘11. I’m looking forward to comparing the two suites when my iLife upgrade arrives in the mail (amazingly, Apple doesn’t offer any way to buy and download iLife).
Even a year later, I continue to be surprised that Windows 7 is so much more efficient than Windows Vista. It uses less disk space than Vista and outperforms it across the board, even on relatively modest hardware.
In the missed-opportunities category, Microsoft deserves special mention for its inability to capitalize on its long history of developing Windows for tablets. Although Windows 7 fully supports touchscreens, the OS itself isn’t well suited for full-time operation with a fingertip. I have three touch-enabled PCs in this house—two all-in-one desktop PCs and a Dell Tablet PC. The touch features feel like a novelty, and I rarely use them. I’m pretty certain that smart people in Redmond are working to make touch features a more natural part of Windows 8, but we’re unlikely to see any of those efforts for at least another year, giving iOS and Android tablets an awfully big head start.
I continue to be amazed and impressed with Windows Media Center. Last week I upgraded our living room Media Center PC with a Ceton InfiniTV tuner, which uses a single CableCARD to tune up to four HD cable channels. (I’ll have a more detailed look at that system next week.) The Media Center interface is fluid and elegant, easily more usable than any alternative, including TiVo, and the whole system has been a joy to use. My sources in Redmond tell me, however, that the Media Center team was essentially disbanded after Windows 7 shipped. I hope that Microsoft is planning a Windows 8 Media Center that will be capable of going head to head with Apple and Google’s TV offerings. If they let that work go to waste, it will be another tremendous missed opportunity.
In the year after Windows Vista was released, I spent an unfortunate amount of time and energy writing posts about how to tweak, tune, and work around its flaws and usability headaches. What I’ve enjoyed most about the last year has been not having to do the same for Windows 7. No, it’s not perfect, but it’s very, very good. Microsoft seems to have figured out, finally, that the best way to design great software is to focus on the user’s experience, not just check off items on a feature list.
If Microsoft follows the playbook and the three-year development cycle it used so successfully for this release, the first beta of Windows 8 will appear roughly a year from now. In fact, the window for feedback that will actually influence the design of the next Windows version is closing soon. What are the flaws in Windows 7 that you want to see addressed? What features are at the top of your must-add list? Leave your comments in the Talkback section.
By: Ed Bott is an award-winning technology writer with more than two decades’ experience writing for mainstream media outlets and online publications.
Top 10 reasons VNet Professionals recommend Outlook 2010
1.- Manage multiple e-mail accounts from one place.
You can easily manage e-mail messages from multiple mailboxes. Synchronize multiple e-mail accounts from services such as Hotmail, Gmail, or just about any other provider to Outlook 2010. Improved connectivity with Microsoft Exchange Server supports the use and management of multiple Exchange Server e-mail accounts in one location.
2.- Manage large volumes of e-mail with ease.
Conversation View in Outlook 2010 improves the tracking of e-mail conversations—reducing information overload—and helps you manage large amounts of e-mail with ease. Entire conversations can be condensed or categorized with a single click. And, new conversation management tools enable you to save valuable inbox space by turning dozens of e-mails into just a few conversations using the Clean Up feature. Or, use the Ignore feature to send the entire conversation to your Deleted Items.
Here’s how to start harnessing the power of Conversation View: On the View tab, in the Conversations group, select Show as Conversations.
3.- Customize common tasks into single-click commands.
Create and save custom actions in a new way with Quick Steps in Outlook 2010. You can save time by creating and defining multistep tasks that you can execute with a single click, including reply and delete, move to a specific folder, create a new e-mail to assigned groups, and more.
4.- Make scheduling a breeze.
Conveniently and efficiently schedule appointments, share your calendar availability and manage your work schedule. With the E-mail Calendar feature, you can send your schedule to others so they can quickly find time for your next appointment. And, the new Schedule View provides a horizontal display of multiple calendars. New calendar management tools enable you to save frequently used groups of calendars so they can be quickly redisplayed whenever you need them.
5.- Search to easily find what you’re looking for.
With Outlook 2010, you can easily sort through high volumes of data. The enhanced Search Tools provide you with ways to quickly find and manage large quantities of e-mail, calendar, and contact items.
6.- Create e-mail messages that capture attention.
Dynamic graphics and picture editing tools are not just for Word and PowerPoint anymore. With Outlook 2010, you can grab your readers’ attention by using compelling visuals such as prebuilt SmartArt™ graphics, Office themes, and Styles. You also can more easily bring your ideas across to your readers by inserting and formatting screenshots in Outlook.
7.- Stay connected to your social and business networks.
Outlook 2010 is your hub for friends, family, and colleagues. Use the Outlook Social Connector to get additional information about people, such as mutual friends and other social information, while staying better connected to your social and business circles.
8.- Ensure that your e-mail messages get to the intended audience.
For business users, sending unnecessary e-mail messages to out-of-office contacts, accidentally replying to a large distribution list, or distributing confidential information outside the company are frequent concerns. With the new MailTips feature, you’re alerted when you are about to send e-mail to a large distribution list, to someone who is out of the office, or to individuals outside the organization.
9.- Receive voice mail previews in your inbox.
With Outlook 2010 and new technology in Exchange Server 2010, a voice-to-text preview of a recorded voice message is sent along with the voice mail recording directly to your inbox. Access your messages virtually anywhere using your computer, Microsoft Outlook Mobile, or Microsoft Outlook Web App.
10.- Initiate live conversations from Outlook.
Keep in touch with your contacts. Using Office Communicator or your instant messaging application, Outlook 2010 provides presence and status information for those on your buddy or contact list. Hover over a name, see their availability, and then easily initiate a conversation directly through instant messaging. With Office Communicator, you can even start a voice call without leaving Outlook.
Security is not just Antivirus alone.
Security is not just Antivirus alone. You also need to protect the data on your PC or laptop!
Final Defense protects your data if your laptop (or notebook) ever gets stolen, even if the hard drive is removed. With the ‘Track and Trace’ feature, Final Defense gives you a chance to recover the laptop.
You don’t think it can happen to you? Consider this: over 1 million laptops went missing or were stolen last year. It gets worse every year!
You believe your laptop’s password protection is sufficient? Sorry to disappoint you: thieves can strip all the data from your laptop anyway. Don’t let this happen to your laptop! Protect your data and click the button below. (Of course, you can read further to see how Final Defense will protect your laptop data, 100% guaranteed!)
Final Defense protects you against one more form of Identity Theft, keeping your sensitive and personal data safe. It creates the final safety net when all usual precautions to prevent physical theft have been taken but have failed.
As soon as you report that your computer has been stolen, Final Defense activates a code that disables the computer the moment it goes online, immediately preventing access to the information contained on the hard drives.
Even when your laptop does not make any connection with the internet, it is still protected! Final Defense has a TTL (Time To Live) feature that you can set: when your laptop is not used during this TTL period, it locks automatically, and only YOU can unlock and use the laptop.
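As an illustration of how such a TTL check might work (the class and method names here are invented for this sketch, not Final Defense’s actual API), the logic boils down to comparing the time since the last check-in against the configured TTL:

```python
import time

class TtlLock:
    """Sketch of an offline TTL lock-out.

    If the machine has not checked in within ttl_seconds, it considers
    itself locked until the owner explicitly unlocks it. Invented names;
    this is only a model of the behavior described above.
    """

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.last_checkin = time.monotonic()
        self.owner_unlocked = False

    def check_in(self):
        # Called whenever the laptop is used or reaches the internet.
        self.last_checkin = time.monotonic()
        self.owner_unlocked = False

    def unlock(self):
        # Only the owner's credential clears a TTL lock-out.
        self.owner_unlocked = True

    def is_locked(self, now=None):
        now = time.monotonic() if now is None else now
        expired = (now - self.last_checkin) > self.ttl
        return expired and not self.owner_unlocked
```

The key design point is that the lock needs no network at all: the decision is made locally from the elapsed time, which is what lets the protection work on a laptop that never goes online.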
In all cases of protection Final Defense makes the hard drive useless to others when it is removed from the laptop; an additional security step.
Final Defense is the solution:
Delivers protection ANYWHERE in the world.
Prevents thieves from accessing your data through Auto Activation.
Disables laptop completely from further use by thieves.
Tracks and traces your computer - if you lose it we can help you find it.
Protects your data even when you’re NOT connected to the Internet.
Allows for easy ‘text message’ reporting of a stolen laptop.
Reports the location of your laptop and confirms Remote Data Protection.
Blocks your hard drive from use even when removed from your laptop.
If you have backed up data, even off site, we will protect access to it.
Hispanic Business Showcase Seminar
To WIN, be present at the Hispanic Business Showcase Seminar I am presenting:
Saturday, September 11, 2010 at 2:15 pm
San Diego Convention Center | Room 28D (upper floor)
See the Full Schedule for the Hispanic Business Showcase
BIO: Antonio de la Cerda has been in the Information Technology field for 15 years. He is currently the CEO of VNet Professionals Inc., a Hispanic Minority Owned and Small Business Certified IT Consulting Firm and San Diego’s leading Network Integrator/Value Added Reseller (VAR) of VoIP business-class phone systems, IT consulting services, and outsourced managed services for Microsoft Windows environments for small, medium, and enterprise businesses. Mr. De la Cerda has experience working on IT projects for such clients as: Petco Park, Padres Ball Park; UCSD; SDSU; Francis Parker School; Rubios, Inc. (Corporate Offices); Islands Restaurants (Corporate Offices); ILA & Zammit Engineering; Luce, Forward, Hamilton & Scripps LLP; Hooters Restaurant of America (Corporate Offices); Escondido Unified School District; Temecula Valley Bank; Keller Williams (Temecula, Riverside and more…); North County Credit Union; City of Carlsbad; Hyundai TransLead (USA, Canada and Mexico), and others.
His Certifications and Education: Microsoft Certified Systems Engineer (MCSE), Cisco Certified Network Professional (CCNP), AdTran Technical Support Professional for IP Telephony (ATSP/IPT), Allworx Certified Professional; San Francisco State University, Information Technology; Heald Institute of Technology; Universidad Autónoma de Baja California, Business Administration.
Follow Antonio de la Cerda
Topic: Basic IT concepts for the Small Business to help your business grow
Saturday, September 11, 2010 at 2:15 pm
San Diego Convention Center | Room 215
See the Full Schedule for the Hispanic Business Showcase
What all business owners/managers should know about current technology and how to use it to your business’s advantage.
- Did you know small businesses like yours have access to the same million-dollar technology that Fortune 500 companies have, at a fraction of the cost?
- VoIP 101 for the business owner.
- Next-generation technologies for managing a small network: steps to ensure a reliable network.
- Best new practices in IT business continuity: avoid IT disasters.
- Methods for maximizing business continuity and avoiding lost business opportunities.
- On-Site and Off-Site Backup: which is more important?
Who should attend?
Business Owners, IT Professionals, Executives, Management and anyone that has a Business dependent on a computer network.
Information is key to earning new business. Times have changed, and so should the methods of running a business. You need to stay ahead of the current demands of business; if you do not stay current, I am sure your competition will.
Hispanic Business Showcase
Saturday, September 11, 2010 at 2:15 pm
San Diego Convention Center | Room 215
See the Full Schedule for the Hispanic Business Showcase
Microsoft DLL Hijacking Exploit in Action
The DLL load hijacking vulnerabilities exist in many Windows applications because the programs don’t call code libraries—dubbed “dynamic-link library,” or “DLL”—using the full pathname, but instead use only the filename. Criminals can exploit that by tricking the application into loading a malicious file with the same name as the required DLL. The result: Hackers can hijack the PC and plant malware on the machine.
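The search-order problem is easy to see in a toy model. The following Python sketch simulates the loader’s directory-by-directory search with an invented `resolve_library` helper; it is not real Windows loader code, and the file paths are made up for the demonstration.

```python
import os

def resolve_library(name, search_dirs, existing_files):
    # Toy model of the Windows DLL search order (not real loader code).
    # existing_files is a set of paths standing in for the filesystem.
    # A bare filename is searched directory by directory, so whoever
    # controls an early directory (e.g. the folder holding the opened
    # document) wins; an absolute path skips the search entirely,
    # which is the safe pattern the vulnerable applications omit.
    if os.path.isabs(name):
        return name if name in existing_files else None
    for directory in search_dirs:
        candidate = os.path.join(directory, name)
        if candidate in existing_files:
            return candidate
    return None

files = {
    "/downloads/codec.dll",          # attacker-planted copy next to a document
    "/windows/system32/codec.dll",   # legitimate copy
}
search_order = ["/downloads", "/windows/system32"]  # document folder first

# Bare filename: the planted copy in /downloads is found first.
hijacked = resolve_library("codec.dll", search_order, files)

# Full path: the search is skipped and the legitimate copy is used.
safe = resolve_library("/windows/system32/codec.dll", search_order, files)
```

The fix the article describes amounts to the second call: applications that load libraries by full pathname never consult the attacker-controllable directories at all.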
“Microsoft plans to address those of our products affected by this issue in the most appropriate way for customers,” said Jerry Bryant, a group manager with the Microsoft Security Response Center, in a Tuesday entry on that team’s blog. “This will primarily be in the form of security updates or defense-in-depth updates.”
Although Microsoft again declined to call out its vulnerable software, outside researchers have identified as potential targets a number of its high-profile apps, including Word 2007, PowerPoint 2007 and 2010, Address Book and Windows Contact, and Windows Live Mail.
Other vendors’ software may also be at risk, including Mozilla’s Firefox, Google’s Chrome, and Adobe’s Photoshop.
Microsoft has known of the issue since at least August 2009, when researchers with the University of California Davis notified the company of their work. There’s evidence, however, of reports as far back as 2000, and attacks exploiting the flaw the following year, when the Nimda worm leveraged the bug in Office 2000.
HD Moore, chief security officer at Rapid7 and the creator of the Metasploit penetration testing toolkit, was the first to reveal the potential attacks when, on Aug. 19, he said he’d found 40 vulnerable Windows applications. Moore was followed by other researchers who claimed different numbers of at-risk programs, ranging from more than 200 to fewer than 30.
Some vendors have already patched the problem in their software. Both uTorrent and Wireshark, a BitTorrent client and network protocol analyzer, respectively, have been updated to address the bug.
Others are working on a fix. “We’re testing our own Firefox-specific fixes and plan to get them out to users soon,” Mozilla’s security team said in an e-mail reply to questions last week.
Even so, Microsoft said patches may be long in coming to some users. “We recognize that it may take quite a bit of time for all affected applications to be updated and for some, an update may not be possible,” Bryant admitted.
In lieu of patches, the blocking tool is the best defense, he continued. With that in mind, Microsoft plans to make the tool available “within the next couple of weeks” for downloading and deployment using Windows Server Update Services (WSUS), Microsoft’s most-used business patch management mechanism.
The company is also thinking about pushing the tool to everyone, including consumers, via Windows Update, although it would be switched off by default, said Bryant.
San Diego Business Connectors
Here is a letter from the President of the San Diego Business Connectors welcoming new members to include Antonio de la Cerda from VNet Professionals Inc.
Dear San Diego Business Connectors:
Our La Jolla group is definitely growing - I am so excited! And as you all know, the stronger and bigger our meetings get, the more opportunities there are for you to network, build relationships, and strengthen your business. We are a special group - no attendance requirements or pressure, so we’re just plain more fun! Today’s newcomers were:
Rafael Pinedo, New York Life Insurance Company
Ana Chanelo, Dr. Flowers Vision Institute
Dr. Frances Chen, DDS
Geri Capehart, Guardian Life Insurance
Asmita Patel, CPMS Credit Card Processing
Antonio De La Cerda, VNet Professionals Inc.
Please watch the attached videos!
Here is why we joined the SDBC and why you should too: see the interviews and what other members say about the SDBC.
Meet the organizer:
VNet Professionals at the San Diego County Hispanic Chamber of Commerce Mixer
VNET PROFESSIONALS INC., AN IT CONSULTING FIRM, WILL BE EDUCATING COMPUTER USERS ON BEST PRACTICES FOR PROTECTING COMPUTER DATA
SAN DIEGO, CA, MAY 27, 2010: VNet Professionals Inc. will provide free education on best practices for protecting your computer data and recovering it within 30 minutes of a computer disaster, today at 4:00 p.m. at the San Diego County Hispanic Chamber of Commerce Networking and Mixer event. The event will take place at: The Hornblower, San Diego; South Broadway Float, 950 N. Harbor Drive, San Diego, CA 92101
When an IT disaster strikes, whether in the form of a natural disaster, a blackout or, far more likely, a common server failure, virus attack or software-user error, your ability to respond well to the crisis can make the difference between a bright future and long days of recovery. The success of your business, or the failure that closes it, can hinge on the loss of information (computer data). This informative seminar’s sole purpose is to show you that business continuity is not just about insurance or backup; it’s about knowing without a doubt that you can quickly restore a server or data in minutes, including recovering a down server remotely (without on-site user intervention).
Antonio de la Cerda, President and CEO of VNet Professionals Inc., will give demonstrations and hand out free trial copies of the software being demonstrated. If the most important asset of a business is its accumulated information, including contacts, email, accounting records, quotes and data, then you have already realized the importance of protecting it. These next-generation technologies are solutions that enterprise companies cannot do without, and now we are bringing them to small businesses to protect their critical data from a hard drive crash, virus attack or hardware failure.
Who Should Attend?
Business owners, IT professionals, executives, managers and anyone with critical data on a computer who needs to ensure that business data can be restored in minutes. Learn how fast, easy, reliable and cost-effective this solution can be, and how it can save you by providing business continuity and disaster recovery.
What Attendees Will Learn
• Best new practices in backup and disaster-recovery planning for your business, and why your current and old methods are outdated and put your business at risk
• Methods for maximizing business continuity and avoiding lost business opportunities
• Bare-metal recovery
• On-site and off-site backup
Do not be a statistic or a risk-taker with your business:
• 31% of PC users have lost their files due to events beyond their control.
• 34% of companies fail to test their tape backups, and of those that do, 77% have found tape back-up failures.
• 60% of companies that lose their data will shut down within 6 months of the disaster.
• 93% of companies that lost their data center for 10 days or more due to a disaster filed for bankruptcy.
The event will take place at:
The Hornblower, San Diego; South Broadway Float, 950 N. Harbor Drive, San Diego, CA 92101
Small Business Server 2008
See the features and benefits video demonstrating Remote Web Workplace, one of many great features built into Microsoft’s Small Business Server 2008. The advantage and competitive edge that Small Business Server 2008 offers a small business owner with 75 users or fewer can be the difference between merely being productive and having the upper hand with your time.
141 tech experts to follow on Twitter, updated for 2010
Twitter can be a valuable tool for techies — if you know who to follow. Here is a list of 141 of the top technology experts, journalists, and thought leaders you can find on Twitter, updated for 2010.
My original list of 100 tech experts on Twitter that I put together last year drew a lot more attention and interest than I expected. However, in the time since I first did that list I’ve had a number of people tell me about some of the great tech Twitterers that I left off the list, plus I’ve discovered some new people to add.
As a result, I’ve expanded the list to 140 (a magical number in the Twitter universe). A number of people who should not have been left off have now been added and you’ll also see some people you’ve probably never heard of, but who can add some excellent tech info to your Twitter stream.
For updates and perspective on the latest tech news you can also follow me on Twitter at http://twitter.com/jasonhiner.
So here is the list, which is not ranked 1-140 but simply listed in alphabetical order. If there are others you think should be added to the list, make a note in the comments.
1.Chris Anderson (@chr1sa) Editor in Chief of Wired and author of The Long Tail
2.Michael Arrington (@techcrunch) Founder of TechCrunch
3.Matt Asay (@mjasay) COO of Canonical (the company behind Ubuntu) and Open Source columnist for CNET
4.John Battelle (@johnbattelle) Author and pundit on Google and Internet search
5.Veronica Belmont (@veronica) Host of Tekzilla and Qore, and former CNET TV host
6.Randall Bennett (@randallb) Founder of TechVi; former CNET TV producer
7.David Berlind (@dberlind) TechWeb Editor-in-Chief
8.Tim Berners-Lee (@timberners_lee) Inventor of the World Wide Web
9.Ryan Block (@ryan) Former Engadget editor and co-founder of GDGT
10.Henry Blodget (@hblodget) Controversial Wall Street journalist who covers tech sector
11.Danah Boyd (@zephoria) Academic/researcher in new media
12.Ed Bott (@edbott) Microsoft Windows expert, blogger, book author
13.Paul Boutin (@paulboutin) Reporter for VentureBeat, The New York Times, and Wired
14.Tony Bradley (@tonys3kur3) Freelance tech writer specializing in security
15.Rick Broida (@cheapskateblog) CNET blogger who scours the Web looking for the best deals in tech
16.Jason Calacanis (@jasoncalacanis) CEO of Mahalo, founder of Weblogs Inc.
17.Pete Cashmore (@mashable) CEO of Mashable
18.Bonnie Cha (@bonniecnet) CNET mobile phone pundit
19.Jacqui Cheng (@eJacqui) Associate editor for Ars Technica
20.Robert Cringely (@cringely) Long-time technology writer and pundit
21.Brian Cooley (@briancooley) CNET car-tech editor
22.Charles Cooper (@coopeydoop) Veteran reporter for CNET news.com and cbsnews.com
23.Dan Costa (@dancosta) Executive editor at PC Magazine
24.David Davis (@davidmdavis) Author, blogger, expert on Cisco and virtualization technologies
25.Chris Dawson (@mrdatahs) ZDNet blogger on technology in education
26.Natali Del Conte (@natalidelconte) CNET TV host of Loaded and tech correspondent for CBS News
27.Mrinal Desai (@mrinaldesai) Co-founder of CrossLoop; tech news junkie
28.Sam Diaz (@sammyd) ZDNet news hound on the Between the Lines blog
29.Larry Dignan (@ldignan) ZDNet Editor in Chief; prolific tech news blogger
30.Cory Doctorow (@doctorow) Co-editor of Boing Boing; digital rights activist
31.Esther Dyson (@edyson) Veteran technology pundit
32.Matt Cutts (@mattcutts) Google engineer, blogger
33.Bill Detwiler (@billdetwiler) TechRepublic’s head technology editor
34.John C. Dvorak (@therealdvorak) Famously cranky tech pundit
35.Erik Eckel (@erikeckel) IT consultant and TechRepublic writer
36.Mike Elgan (@mike_elgan) Widely-published freelance tech writer
37.Philip Elmer-DeWitt (@philiped) Apple reporter for Fortune magazine
38.Rob Enderle (@enderle) Long-time analyst of the PC industry
39.Caterina Fake (@caterina) Co-founder of Flickr
40.Dan Farber (@dbfarber) Editor of CBSNews.com; former editor of CNET and ZDNet
41.Scot Finnie (@scotfinnie) Editor in Chief of Computerworld
42.Mary Jo Foley (@maryjofoley) One of the world’s top commentators on Microsoft
43.Ina Fried (@inafried) CNET’s resident Microsoft analyst
44.John Furrier (@furrier) Silicon Valley entrepreneur; now specializing in mobility
45.Bill Gates (@billgates) Microsoft co-founder and former CEO
46.Steve Gillmor (@stevegillmor) Editor of TechCrunch IT, veteran tech journalist
47.Bob Gourley (@bobgourley) CTOvision.com blogger; government IT expert
48.John Gruber (@gruber) Author of Daring Fireball blog; covers mostly Apple
49.Dion Hinchcliffe (@dhinchcliffe) Blogger and consultant on Web 2.0 for business
50.Chuck Hollis (@chuckhollis) EMC CTO and blogger
51.Alex Howard (@digiphile) Enterprise tech editor; excellent tech news source on Twitter
52.Andy Ihnatko (@ihnatko) Apple pundit
53.Jeff Jarvis (@jeffjarvis) Professor and author who covers tech and new media
54.Mark Kaelin (@markwkaelin) TechRepublic editor covering Windows and PCs
55.Mitch Kapor (@mkapor) Lotus, Mozilla pioneer; angel investor
56.Guy Kawasaki (@guykawasaki) Venture capitalist and former Mac columnist
57.Doug Kaye (@dougkaye) Founder of IT Conversations
58.Vinod Khosla (@vkhosla) One of the tech world’s most influential venture capitalists
59.Michael Krigsman (@mkrigsman) Watchdog of IT project failures
60.Sarah Lacy (@sarahcuda) Freelance author covering Silicon Valley
61.Leo Laporte (@leolaporte) Host of TWiT network and former TechTV host
62.Brian Lam (@blam) Editorial Director of Gizmodo
63.Shira Lazar (@shiralazar) Web video journalist covering the intersection of tech, culture, and new media
64.Nicole Lee (@nicole) CNET editor for mobile and other gadgets
65.Jennifer Leggio (@mediaphyter) ZDNet blogger on social media for business
66.Steven Levy (@stevenjayl) Tech book author and Wired writer
67.Cali Lewis (@calilewis) Host of GeekBrief.TV
68.Charlene Li (@charleneli) Author and social media thought leader
69.Jim Louderback (@jlouderb) CEO of Revision3; former editor of PC Magazine
70.Scott Lowe (@scottdlowe) CIO, author, and TechRepublic columnist
71.Abbie Lundberg (@abbielundberg) Former editor in chief of CIO Magazine
72.Andrew Mager (@mager) Web developer and ZDNet blogger on Web 2.0
73.Om Malik (@om) Founder of GigaOm
74.Amber MacArthur (@ambermac) Tech journalist and broadcaster
75.Richard MacManus (@rww) Editor and founder of ReadWriteWeb
76.John Markoff (@markoff) Science writer for The New York Times
77.Marissa Mayer (@marissamayer) Google product development executive
78.Caroline McCarthy (@caro) CNET writer covering Web 2.0
79.Harry McCracken (@harrymccracken) Founder of Technologizer and former editor of PC World
80.Tom Merritt (@acedtect) Host of Buzz Out Loud and various CNET TV shows
81.Matthew Miller (@palmsolo) ZDNet blogger on smartphones
82.Clayton Morris (@claytonmorris) Fox TV personality covering geek topics and social media
83.Walt Mossberg (@waltmossberg) Tech columnist for The Wall Street Journal
84.Matt Mullenweg (@photomatt) Founder of WordPress
85.Rafe Needleman (@rafe) Editor of CNET’s Webware
86.Patrick Norton (@patricknorton) Tekzilla host and former TechTV personality
87.Andrew Nusca (@editorialiste) ZDNet news writer; SmartPlanet.com editor
88.Tim O’Reilly (@timoreilly) Founder and CEO of O’Reilly Media
89.Jeremiah Owyang (@jowyang) Forrester analyst on new media technologies
90.John Paczkowski (@johnpaczkowski) Tech news hound for All Things Digital
91.Nilay Patel (@reckless) Engadget editor
92.Jason Perlow (@jperlow) ZDNet blogger and Linux Magazine writer
93.Chris Pirillo (@chrispirillo) Tech geek turned Internet personality
94.Jason Pontin (@jason_pontin) Editor in Chief of MIT Technology Review
95.David Pogue (@pogue) Tech columnist for New York Times and CNBC
96.Seth Porges (@sethporges) Tech editor at Popular Mechanics magazine
97.JR Raphael (@jr_raphael) Tech news writer for PC World
98.Maggie Reardon (@maggie_reardon) CNET reporter on mobile and wireless technology
99.Don Reisinger (@donreisinger) Gadget columnist for CNET
100.Gabe Rivera (@gaberivera) Founder of Techmeme
101.Tim Robertson (@mymac) Podcaster; Founder of MyMac.com
102.Peter Rojas (@peterrojas) Founding editor of both Gizmodo and Engadget
103.Kevin Rose (@kevinrose) Founder of Digg.com, host of Diggnation
104.Joshua Schachter (@joshu) Creator of Delicious, a.k.a. del.icio.us
105.Jack Schofield (@jackschofield) Computer editor at The Guardian
106.Erick Schonfeld (@erickschonfeld) TechCrunch editor
107.Robert Scoble (@scobleizer) Tech writer and social media flag-bearer
108.Sascha Segan (@saschasegan) Mobile writer for PC Magazine
109.Doc Searls (@dsearls) Tech journalist, author, open source advocate
110.Stephen Shankland (@stshank) CNET News reporter, covering Web and search
111.Deb Shinder (@debshinder) Popular tech tip writer for TechRepublic and other publications
112.Dwight Silverman (@dsilverman) Technology editor for the Houston Chronicle
113.John Siracusa (@siracusa) Apple writer for Ars Technica
114.Jason Snell (@jsnell) Editorial Director of Macworld
115.Joel Spolsky (@spolsky) Co-founder of Stack Overflow
116.Mark Spoonauer (@mspoonauer) Editor in Chief of LAPTOP
117.Brad Stone (@bradstone) Technology reporter for The New York Times
118.Robert Strohmeyer (@rstrohmeyer) PC World editor and columnist
119.Kara Swisher (@karaswisher) Silicon Valley blogger for AllThingsD.com
120.Don Tennant (@dontennant) Former editor in chief of Computerworld
121.Paul Thurrott (@thurrott) Microsoft Windows columnist, editor, and podcaster
122.Baratunde Thurston (@baratunde) Editor, writer, and comedian; one of the funniest techies on Twitter
123.Kevin Tofel (@kevinctofel) Managing Editor at jkOnTheRun, mobile/smartphone expert
124.Joshua Topolsky (@joshuatopolsky) Editor in Chief of Engadget
125.Gina Trapani (@ginatrapani) Founding editor of Lifehacker.com
126.Dan Tynan (@tynan_on_tech) Tech humor columnist and veteran tech writer
127.Lance Ulanoff (@lanceulanoff) Editor in Chief of PC Magazine
128.Rick Vanover (@rickvanover) Senior IT professional and TechRepublic blogger
129.Tony Vincent (@tonyvincent) Writer on mobile tech and IT in education
130.Werner Vogels (@werner) Amazon.com CTO
131.Ariel Waldman (@arielwaldman) Blogger on tech and space technology
132.Jack Wallen (@jlwallen) Linux enthusiast, columnist, and tip writer
133.Padmasree Warrior (@padmasree) CTO of Cisco Systems
134.Seth Weintraub (@llsethj) Computerworld columnist covering Google and Apple
135.Fred Wilson (@fredwilson) Tech venture capitalist in New York
136.Dave Winer (@davewiner) “The father of blogging and RSS” (BBC)
137.Alex Wolfe (@awolfe58) Editor in Chief of InformationWeek
138.Molly Wood (@mollywood) CNET TV host and writer; creator of the famed “Molly rant”
139.Dave Zatz (@davezatz) Gadget and digital lifestyle blogger
140.Jonathan Zittrain (@zittrain) Author and Harvard professor covering the Internet
141.Antonio de la Cerda (@VNetPros) Founder of VNet Professionals Inc. and technologist.
Cloud, it’s a web thing
Having read (hat-tip Dennis Howlett) Randy Bias’ article at Kendallsquare on Debunking the “No Such Thing as a Private Cloud” Myth I have to say — rather like the apocryphal Irish direction-giver — if I’d wanted to make a case for private cloud, I wouldn’t have started from there. Randy and I joined a civilized conversation a few weeks back as a follow-up to my earlier post on this topic, and I fear he’s already forgotten every dam’ thing I said. So I guess I’ll have to reiterate it.
But first, let me (shock, horror!) make the case, such as it is, for private cloud. It looks like we’re stuck with the term, along with all the ugly implementations that are going to be classed under it and which I fear will ultimately lead to its becoming discredited — unfairly dragging the reputation of true cloud computing through the same mud as it does so. As you can see, I still distrust the term, mainly because it is so open to misinterpretation, and I shan’t be using it myself without heavy qualification and many caveats — which, by the way, should make for an interesting panel discussion with Verizon, IBM and others that I’ll be moderating at the All About Cloud event in San Francisco this May [see disclosure]. But I do see circumstances where it’s possible to make a case for implementing cloud-like infrastructure in a private environment, and Randy, despite starting his exposition from completely the wrong starting point, does end up making a statement about private cloud with which I can heartily agree:
“The private cloud model is a critical transitional step. It is an essential component to help larger organizations move their compute capacity to the public cloud.”
The thing that cloud purists and evangelists are too prone to forget is that most enterprises are heavily committed to existing investments in pre-cloud, on-premise infrastructure. These are assets they simply can’t afford to throw away or retire just yet. Very few organisations are lucky enough to be able to start over with a clean sheet and move everything to the cloud in one fell swoop. Therefore, for the next few years, the vast majority of them are going to have a hybrid IT infrastructure — some of it in the cloud, some of it not. This was an important takeaway, by the way, from my interview with SAP CTO Vishal Sikka, which I published recently. They’re going to need a way of bridging the two, and that’s where some kind of cloud-like private infrastructure may come in useful, to mediate between what’s already in the cloud and the other IT assets that are either transitioning towards the cloud or remaining on-premise.
Where Randy and I fundamentally disagree, however, is in our interpretation of the words ‘private cloud’ and in what we each regard as the key characteristics of this transitional infrastructure (nor am I prepared to join him in dignifying the notion with the status of a ‘model’). Randy insists that cloud computing is essentially a business model (pay-as-you-go outsourcing) built on top of an architectural model (shared virtualized infrastructure), and completely ignores — no worse, attempts to deny — that it has anything to do with the Internet.
Yet in my view, the most important attribute of the cloud — too readily overlooked by many commentators — is that it lives in the Internet. The Internet dimension is crucial because it brings with it an obligation and a necessity to remain open to connections. It means that a cloud has to have:
Collective scrutiny and innovation
The third of these is probably the most difficult to grasp and yet the most far-reaching in its impact. Any infrastructure or application service that lives on the Web as a shared resource is constantly tested by two separate yet complementary schools of users:
Skeptics that don’t trust it
Enthusiasts that want to push the envelope of what’s possible
Those two interest groups have a virtuous push-me, pull-you effect on the provider’s infrastructure or application that ensures that it’s constantly staying up-to-date both with every threat that might bring it down and with every emerging enhancement that could make it better. These dual competitive forces impel a cloud platform to evolve in ways that private platforms can never cost-justify. Anyone that designs a perfectly state-of-the-art cloud platform and deploys it to a private environment — even if that private environment is shared by thousands of distinct user organisations (and that’s a tiny minority case) — cuts it off from the competitive pressures that ensure it continues to evolve and protects it from gradual yet inexorable decline into obsolescence.
Therefore, the only use case that I believe makes sense for private cloud is one where it acts as a temporary transition chamber. Either as a controlled environment where IT assets can be prepared for subsequent deployment to a fully cloud existence, or to mediate between public cloud assets and those left operating within the private enterprise environment. A private cloud that helps IT assets move towards the public, Internet-immersed cloud, I can live with. Anything that’s designed instead to somehow avoid connecting to the wider Web is just missing the point.
Phil Wainewright is a commentator and strategist on emerging software industry trends.
Three arrested over 12.7m PC botnet
Authorities in Spain have arrested three men accused of operating a massive botnet composed of 12.7 million PCs that stole credit card and bank log-in data and infected computers in half of the Fortune 1,000 companies and more than 40 banks, according to published reports.
The botnet “Mariposa,” which means butterfly in Spanish, first appeared in December 2008 and grew to be one of the largest botnets ever, The Associated Press reported. It spread the Butterfly worm via removable drives, MSN Messenger, and peer-to-peer programs, and targeted Windows XP and older systems.
Unlike many underground hackers, the alleged ringleaders of the operation were not skilled programmers, but had contacts who were, authorities said.
“They’re not like these people from the Russian mafia or Eastern European mafia who like to have sports cars and good watches and good suits—the most frightening thing is they are normal people who are earning a lot of money with cybercrime,” Cesar Lorenza, a captain with Spain’s Guardia Civil, which is investigating the case, told the news service.
In Spain, names and mug shots of arrested citizens are not released to protect their privacy, though they were identified by their Internet aliases: “netkairo,” 31; “jonyloleante,” 30; and “ostiator,” 25. They face up to six years in prison if convicted of the hacking charges.
More arrests are expected, authorities said. The botnet is no longer operating, according to the AP report.
The cloud slide Steve Ballmer should have shown
There have been lots of blogs, tweets and news stories covering Microsoft CEO Steve Ballmer’s talk at the University of Washington on March 4 about Microsoft’s commitment to cloud computing.
I didn’t intend to write about Ballmer’s hour-plus presentation because there was no news. But the more I thought about his talk, the more I felt it merited comment — at least in terms of the seeming intent behind his words.
Ballmer highlighted a variety of products — everything from Windows Phone 7, to Bing Maps, to the Natal gaming controller — and touted all of these as proof that Microsoft is a leader in cloud computing. Ballmer said 70 percent of Microsoft’s workforce is currently engaged in cloud-computing or cloud-related activities, and that by next year the figure would be 90 percent. Based on his presentation, Ballmer seemingly was using the terms “cloud” and “Internet” interchangeably. But to me, the Web is not the same as the cloud. Then again, maybe I’m just splitting hairs…
I understand that there is no single cloud. Is Microsoft Hotmail a cloud app? Sure, it runs in Microsoft’s datacenters somewhere. Ditto with Xbox Live, the Danger Sidekick services, Office Web Apps, Windows Live services, Microsoft’s hosted Business Productivity Online Services (BPOS), etc. There are lots of Microsoft servers running different Web-based apps and services out there, all of which could be called part of “the cloud” even though none of the ones I’ve mentioned is running on Microsoft’s Windows Azure.
Oh yeah. Azure …. When most pundits and industry observers talk about Microsoft and its cloud strategy, they mean Azure. I bet a lot of Microsoft’s customers and developers think this way, too. Ballmer made very few references to Azure in his UW talk today — maybe because on the Azure cloud front, Microsoft is playing catch-up (at least timing-wise) to others already out there, including Amazon, Google, Salesforce and more.
Ballmer said Microsoft would support the public cloud, the customer (private) cloud, the partner cloud and the government cloud. Until today, I felt Microsoft’s story about how it would do this was pretty clear and straightforward. It was software+services and/or three-screens-and-a-cloud. According to that “story,” Microsoft offers users a wide span of choices: Run your applications on-premises; partially on-premises and partially in the cloud; or completely in the cloud. On the cloud side, these applications can be hosted by Microsoft partners and/or Microsoft.
In other words, like the strategy slide Microsoft has been showing, which maps out those choices. (Note: IaaS is infrastructure as a service and PaaS is platform as a service.)
So why did Ballmer deviate from this script now and try to broaden the definition of Microsoft’s cloud? Some Microsoft employees’ blog write-ups, posted after Ballmer’s presentation, provided a few clues.
The one I found that offered the most coherent and plausible explanation for Ballmer’s more amorphous cloud definition came from Microsoft Senior Product Manager Andrew Kisslo on his “Why Microsoft” blog. Microsoft wants folks to understand that the company was doing “cloud computing” before February 1, 2010 (the date it began charging customers for Windows Azure), and that “experience matters.” The Redmondians want users to know Microsoft is fully committed to the cloud, and that it’s a “leader” in cloud computing, not a follower.
I realize the slide above was probably a little too geeky and dry for the UW audience. A demo of Bing Maps was surely a lot more fun than a discussion of blobs and Web/Worker roles. But I think Microsoft is taking a dangerous route in trying to over-simplify its cloud strategy in the hope of being perceived as the leader instead of a follower. Not everything should get the “consumerization of IT” treatment….
What do you think? Did Ballmer convince anyone today that Microsoft is the established leader in the cloud? Or did Microsoft’s new messaging backfire?
Mary Jo Foley has covered the tech industry for more than 20 years.
Pay as you go computing: A viable way forward?
Microsoft has allowed the renting of Windows and Office, letting users pay a flat fee per year for these highly popular technologies. This would greatly benefit low-income but high-piracy areas, but could in theory extend to students who only need Windows and Office for the two or three years of their degree programme.
If this were to widen further to not only operating systems and software, but to computers, laptops or netbooks and other portable technologies, could this be a future trendsetter?
There are upsides and downsides to this. Neither one is right nor wrong, and depending on anything from personal circumstances to licensing restrictions, it may or may not work. Nevertheless, it’s a business model that could be exploited - which Microsoft is already doing with a patent it holds - and others could easily follow suit.
It could be cheaper overall (…or not)
If you know how long you will be renting a computer, say two or three years for a degree programme, it could well be cheaper to annually renew your licence for an operating system and the rest of it. Depending on the timescale, of course, it could work in your favour.
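With purely hypothetical numbers (a made-up annual licence fee and one-off purchase price, not real Microsoft pricing), the break-even arithmetic behind "rent for a short, known period" is simple to sketch:

```python
def cheaper_option(annual_fee, purchase_price, years):
    """Compare renting a licence year by year against buying outright.
    All prices are illustrative placeholders, not real list prices."""
    rental_total = annual_fee * years
    if rental_total < purchase_price:
        return "rent"
    if rental_total > purchase_price:
        return "buy"
    return "either"

# A three-year degree programme: renting at 50 a year beats a 200 one-off.
print(cheaper_option(50, 200, 3))  # prints "rent"
# Keep the machine for five years and buying wins instead.
print(cheaper_option(50, 200, 5))  # prints "buy"
```

The crossover point is simply `purchase_price / annual_fee` years, which is why renting suits short, fixed periods like a degree programme and buying suits open-ended use.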
Then again, the main cost for students is the hardware itself. As a student you will benefit from all kinds of offers and price reductions, because you’re automatically assumed to be impoverished and living off nothing but baked beans and a cold cut of bacon.
That is all good and well - only if you decide upon a Windows machine. For Apple users, the operating system ties in with the hardware and you’ll still pay through your teeth for it, even with their offers available.
And of course, if you are using open-source software then you only have to pay for the hardware, massively cutting down what you would have spent overall.
Limited in hardware due to software requirements
Again with the Mac vs. Windows debate: if you are a die-hard Mac user you may not be able to use the operating system that you are used to, as the two come as a pair. Also, depending on the type and version of operating system available on a rental licence - such as a 32-bit edition of Windows XP - this may limit you to a non-64-bit computer even if you need one.
Manufacturers aren’t going to build machines where the hardware outweighs the software.
The cloud and not being tied down
Microsoft has hit the cloud like a storm, although now they are charging for it. Amazon is one of the main competitors with its EC2 service, and Google is brewing something as Apple completes the final stages of its iDatacenter build. Cloud computing is going mental - but in a good way.
With rental computers, there’s a good chance this cloud business could take off like a storm, and storing files in the cloud instead of on the local device means you won’t be tied down to one computer. It also means big bucks for the industry, and easy transfer between devices when one inevitably breaks.
By Zack Whittaker, the youngest in the ZDNet network.
Can your IT services firm handle your tech recycling or refurbishment needs?
OK, so I know that many GreenTech Pastures readers probably deal with at least one outside vendor that handles some aspect of their IT work, whether it’s hardware deployment or systems integration or project rollouts. Some of the larger services organizations, particularly those allied with the tier-one hardware vendors (Hewlett-Packard, Dell, IBM, etc.), have built up some sort of asset-disposition team.
But not everyone is big, and very few IT services firms have built up the relationships to do this on their own. That’s why distributor Tech Data has stepped in to ally itself with TechTurn, which is a provider of services focused on technology recycling, refurbishment and remarketing. I like to think of those three activities as the new “three R’s.”
The timing is pretty cool, actually, if you think about it, because I am sure that many of you are nursing systems that are four or five years old now. With Windows 7 hanging out there, this may be the first time that you’ve been faced with the problem of how to get rid of your old hardware responsibly, and how to eke as much out of it as you can along the way.
If you’re a VAR or IT solutions provider reading this entry, this is another good reason that you might decide to work with Tech Data on a project rather than another distributor. TechTurn is pretty serious about working with IT services firms that it knows are out there in front of customers and it signed up a Microsoft executive last fall to handle this. TechTurn is closely aligned with Dell and Microsoft, to boot.
TechTurn meets all the requisite checkoff items that you need out of a technology recycling company, and it touts the fact that it holds an R2/RIOS certification for electronics recycling. I had to look this up, because I didn’t know what it was. But essentially, there are two things at play here. The R2 thing is a standard for Responsible Recycling Practices that was developed collectively by a lot of different stakeholders including the U.S. Environmental Protection Agency, OEMs including Lenovo and Dell, the Information Technology Industry Council, the Electronic Recyclers’ Industry Association and a number of state and local governments. The second thing, RIOS, is an integrated system for managing environmental, quality and health and safety concerns.
TechTurn is actually the first company to earn both certifications worldwide.
Heather Clancy is an award-winning business journalist in the New York area with more than 20 years of experience covering the high-tech industry.
Is cloud computing hype, or something riskier?
Ask a group of eight IT industry analysts what 2010 will bring, and you will get 100 different answers.
In the latest BriefingsDirect podcast, ZDNet compatriot Dana Gardner attempted just that, and one topic seemed to rise to the top of the list above all others: cloud computing.
Is it because cloud computing is lighter than air, or is it really a tectonic plate shift? While four of us (including myself) put cloud on top of the list, two others poured cold water on the trend. (Two others did not mention cloud as a major 2010 trend.)
Jason Bloomberg, for one, says most larger enterprises are not ready for the cloud. “I just don’t see cloud computing striking it big in 2010,” he said. “When we talk to enterprise architects, we see a lot of curiosity and some dabbling. But, at the enterprise scale, we see too much resistance in terms of security and other issues to put a lot of investment into it.” Smaller organizations, on the other hand, are more inclined to sign on to the cloud.
Tony Baer also pooh-poohed the rise of cloud in 2010, noting that cloud is “not going to be the ‘new normal’ [as I said it would be in my set of predictions].” Expect to see the same demons we wrestled with outsourcing over the years, Tony said. “We’re going to see this year an uptake of all the management overhead of dealing with cloud and virtualization, the same way we saw with outsourcing years back, where we thought we’d just throw labor costs over the wall.”
Brad Shimmin, who sees a banner year ahead for cloud, nevertheless echoed Jason’s view that cloud will be more likely seen among smaller enterprises. With vendors offering hybrid, premise/cloud, and appliance/service offerings, “it’s going to really let companies, particularly those in the small and medium business (SMB) space, work around IT constraints without sacrificing the control and ownership of key processes and data, which in my mind is the key, and has been one of the limiting factors of cloud this year.”
Dave Linthicum, captain of the cloud, is bullish on the concept, but warns that 2010 may see its share of thunderheads. He foresees cloud crashes making headlines this year.
Joe McKendrick is an author, consultant and speaker specializing in trends and developments shaping the technology industry.
Why RAID 6 stops working in 2019
Three years ago I warned that RAID 5 would stop working in 2009. Sure enough, no enterprise storage vendor now recommends RAID 5.
They now recommend RAID 6, which protects against two drive failures. But in 2019 even RAID 6 won’t protect your data. Here’s why.
The power of power functions
I said that even RAID 6 would have a limited lifetime.
. . . RAID 6 in a few years will give you no more protection than RAID 5 does today. This isn’t RAID 6’s fault. Instead it is due to the increasing capacity of disks and their steady URE rate.
Late last year, Sun engineer, DTrace co-inventor, flash architect and ZFS developer Adam Leventhal did the heavy lifting to analyze the expected life of RAID 6 as a viable data protection strategy. He lays it out in the Association for Computing Machinery’s Queue magazine, in the article Triple-Parity RAID and Beyond, which I draw from for much of this post.
The good news: Mr. Leventhal found that RAID 6 protection levels will be as good as RAID 5 was until 2019.
The bad news: Mr. Leventhal assumed that drives are more reliable than they really are. The lead time may be shorter unless drive vendors get their game on. More good news: one of them already has - and I’ll tell you who that is.
The crux of the problem
RAID arrays are groups of disks with special logic in the controller that stores the data with extra bits so the loss of 1 or 2 disks won’t destroy the information (I’m speaking of RAID levels 5 and 6, not 0, 1 or 10). The extra bits - parity - enable the lost data to be reconstructed by reading all the data off the remaining disks and writing to a replacement disk.
The problem with RAID 5 is that disk drives have read errors. SATA drives are commonly specified with an unrecoverable read error (URE) rate of one error per 10^14 bits read. At 512 bytes per sector, that works out to one unreadable sector in roughly every 24 billion sectors read.
24 billion sectors is about 12 terabytes. When a drive fails in a 7-drive, 2 TB SATA RAID 5, you’ll have 6 remaining 2 TB drives, roughly 12 TB that must be read in full to rebuild. As the RAID controller reconstructs the data, it is very likely to see a URE. At that point the RAID reconstruction stops.
Here’s the math:
(1 - 1/(2.4 x 10^10)) ^ (2.3 x 10^10) = 0.3835
That 0.3835 is the probability the rebuild completes cleanly. In other words, you have about a 62% chance of data loss due to an uncorrectable read error on a 7-drive RAID with one failed disk, assuming a 10^14 read error rate and ~23 billion sectors in 12 TB. Feeling lucky?
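You can sanity-check that arithmetic in a few lines of Python. The two constants below are the article’s assumptions (one URE per roughly 2.4 × 10^10 sectors, and about 2.3 × 10^10 sectors to read during the rebuild), not independent measurements:

```python
# Article's assumptions: one URE per ~2.4e10 sectors read (a 10^14-bit
# URE rate at 512-byte sectors), and ~2.3e10 sectors (~12 TB) that must
# be read to rebuild a 7-drive, 2 TB RAID 5 after one disk fails.
sectors_per_ure = 2.4e10
sectors_to_read = 2.3e10

# Probability that every sector reads cleanly during the rebuild.
p_clean = (1 - 1 / sectors_per_ure) ** sectors_to_read

print(f"rebuild survives: {p_clean:.4f}")         # ~0.3835
print(f"chance of data loss: {1 - p_clean:.0%}")  # ~62%
```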
RAID 6 tackles this problem by creating enough parity data to handle 2 failures. You can lose a disk and have a URE and still reconstruct your data.
Some complain about the increased overhead of 2 parity disks. But doubling the size of the RAID 5 stripe gives you dual-disk protection with the same capacity. Instead of a 7-drive RAID 5 stripe with 1 parity disk, build a 14-drive stripe with 2 parity disks: the same fraction of capacity lost to parity, with protection against 2 failures.
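The capacity claim is easy to verify: the usable fraction of a stripe is just data drives divided by total drives. A quick check using the drive counts from the paragraph above:

```python
# Usable-capacity fraction = data drives / total drives.
raid5_7drive = (7 - 1) / 7     # 7 drives, 1 parity: ~85.7% usable
raid6_14drive = (14 - 2) / 14  # 14 drives, 2 parity: ~85.7% usable

# The same share of raw capacity goes to parity in both layouts,
# but the 14-drive RAID 6 stripe survives two failures.
print(raid5_7drive == raid6_14drive)  # True
```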
Digital nirvana, eh? Not so fast, my friend.
Grit in the gears
Mr. Leventhal points out that a confluence of factors is leading to a time when even dual parity will not suffice to protect enterprise data.
Long rebuild times. As disk capacity grows, so do rebuild times. 7200 RPM full drive writes average about 115 MB/sec - they slow down as they fill up - which means about 5 hours minimum to rebuild a failed drive. But most arrays can’t afford the overhead of a top speed rebuild, so rebuild times are usually 2-5x that.
More latent errors. Enterprise arrays employ background disk-scrubbing to find and correct disk errors before they bite. But as disk capacities increase, scrubbing takes longer. In a large array a disk might go for months between scrubs, meaning more errors on rebuild.
Disk failure correlation. RAID proponents assumed that disk failures are independent events, but long experience has shown this is not the case: 1 drive failure means another is much more likely.
Simplifying: bigger drives = longer rebuilds + more latent errors -> greater chance of RAID 6 failure.
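The rebuild-time figure from the list above is simple arithmetic: capacity divided by write rate. A sketch using the article’s numbers (a 2 TB drive and a ~115 MB/sec average full-drive write rate):

```python
capacity_bytes = 2e12   # one 2 TB replacement drive
write_rate = 115e6      # ~115 MB/sec average sequential write

best_case_hours = capacity_bytes / write_rate / 3600
print(f"top-speed rebuild: {best_case_hours:.1f} hours")  # ~4.8 hours

# Arrays rarely rebuild at top speed; the article cites 2-5x longer.
lo, hi = 2 * best_case_hours, 5 * best_case_hours
print(f"realistic range: {lo:.0f} to {hi:.0f} hours")
```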
Mr. Leventhal graphs the outcome in the article (chart courtesy of the ACM).
By 2019 RAID 6 will be no more reliable than RAID 5 is today.
The Storage Bits take
For enterprise users this conclusion is a Big Deal. While triple parity will solve the protection problem, there are significant trade-offs.
21 drive stripes? Week long rebuilds that mean arrays are always operating in a degraded rebuild mode? Wholesale move to 2.5″ drives? Functional obsolescence of billions of dollars worth of current arrays?
Home users can relax though. Home RAID is a bad idea: you are much better off with frequent disk-to-disk backups and an online backup like CrashPlan or Backblaze.
What is scarier is that Mr. Leventhal assumes disk drive error rates of 1 in 10^16. That is true of the small, fast and costly enterprise drives, but most SATA drives are 2 orders of magnitude less: 1 in 10^14.
With one exception: Western Digital’s Caviar Green, model WD20EADS, is spec’d at 10^15, unlike Seagate’s 2 TB ST32000542AS or Hitachi’s Deskstar 7K2000.
Robin Harris has been messing with computers for over 30 years and selling and marketing data storage for over 20, in companies large and small.
Ten things we still don’t know about Microsoft’s next-gen Windows Phones
Microsoft has just concluded its February 15 press conference at the Mobile World Congress in Barcelona, where company officials showed off a demo unit of a Windows Phone 7 Series phone.
Those of us watching the Webcast got a good idea of the “Metro” user interface that will be on these phones. If you’re a Zune HD user, you won’t have any trouble figuring out the interface, which includes the same “hub” and “pinning” concepts Microsoft pioneered with the Zune HD. We know there is going to be a dedicated Bing hardware button on the phones. We know Version 1 of these phones won’t have support for Adobe Flash.
We also know who’s on the list of Windows Phone 7 phone makers (LG, Samsung, HTC, HP, Dell, Sony Ericsson, Toshiba, Garmin, Asus and Qualcomm) and carriers (T-Mobile, Orange, Verizon, Vodafone, Sprint, Telefonica, AT&T and Telstra, among others).
But there are still lots of things we still have no idea about, after Microsoft’s hour-and-a-half mobile-strategy presentation.
Here are ten of my still-unanswered questions (some of which I’m hoping to get more details on later today):
1. When are the Windows Phone 7 phones coming to market — other than by “holiday 2010,” as the Softies said today? Does that mean September, October, November? Will all of the Windows Phone 7 handsets be out this year, or are some going to lag into 2011?
2. Will Windows Phone 7 phones be able to multitask? Microsoft showed a video criticizing its competitors for lacking multitasking, but no one said these new phones will multitask. Instead, Microsoft execs said users would be able to access multiple apps via “hubs.” But that doesn’t necessarily mean multitask.
3. Which version of Internet Explorer is the Windows Phone 7 Series running? Microsoft execs said it’s a more advanced version of IE than the company has ever shipped. But they didn’t say which version of IE it is.
4. What’s the operating system inside the Windows Phone 7s? Microsoft execs never said the words “Windows Mobile 7” (and I don’t believe that Windows Phone 7 is the name of the OS, contrary to what some bloggers have asserted recently). Which version of Windows CE is at the core of the OS? CE 6.0 R3 (Cashmere?) or something more current?
5. Which Windows Live services will come “standard” as part of the Windows Phone 7 experience? Microsoft officials said “some” of them will be. I’d assume Hotmail, Windows Live Messenger and Windows Live Photo Gallery. But what about Live Mesh, Movie Maker, Live Writer?
6. There were a lot of rumors about Microsoft offering a “business” version of Windows Mobile 7 and a “consumer” version of it. At today’s event, Microsoft officials emphasized that users want one phone to handle consumer and enterprise tasks. Does this mean that the very consumer-oriented UI for Windows Phone 7 is the only UI? Or will users get a business option at some point? The absence of any mention of the Office Mobile 2010 product today also made me wonder how/when Microsoft and its partners will make that next-gen Office suite available to Windows Phone 7 users. (Remember, the Office Mobile 2010 release is a Windows Mobile 6.5 app… so if there’s no backward compatibility, it’s unclear how/when it will support the new Windows Phone 7s.)
7. What about the promised over-the-air updates for Windows Mobile 7? What about Silverlight support for Windows Mobile 7? Microsoft execs have been on record in the past committing to these features but no mention of either today.
8. Does the fact that all Windows Phone 7 devices will be Zune music/video-capable (as Microsoft execs said today on stage) mean that there are no more dedicated Zune HD players coming from Microsoft? Is the Zune HD, introduced in the fall of 2009, the last of that line? (Update: I asked Microsoft and got a no comment on this one.)
9. What happens to My Phone (Microsoft’s backup/recovery service)? Windows Marketplace for Mobile? No mentions of either of these during today’s event makes me suspicious about their future. (Update: Windows Marketplace for Mobile becomes one of the hubs in Windows Phone 7 devices; sounds like it will not include Windows 6.x apps.) And what about the new release of “Dorado,” the Zune app for PCs? Will that support Windows Phone 7 devices?
10. The elephant in the room (Pink) was not mentioned or alluded to at all during today’s MWC event. I had heard late last week that Microsoft is planning a separate Pink rollout event for the Pure and Turtle phones made by Sharp — possibly this spring. How does the Pink “experience” integrate with/relate to the Windows Phone 7 one, if at all?
Beyond the development tools strategy/vision for Windows Phone 7 (which Microsoft execs have said will be a topic for the Mix 2010 show in mid-March), what else do you want to know about Windows Phone 7?
Update: I got a few answers and lots of “no comment — yet” responses to some of these questions when I had a chance to talk to Microsoft after its event was over today.
Mary Jo Foley has covered the tech industry for more than 20 years.
Windows 7 setup secrets
As of May 5, the general public is finally allowed to download the official Windows 7 Release Candidate. It’s been up on BitTorrent networks since mid-April, and developers with MSDN or TechNet subscriptions have had access to it since April. But those groups constitute a tiny fraction of the people who are seeing the Windows 7 release candidate for the first time with its public release. (You can find downloads and installation instructions at Microsoft’s website.)
For the benefit of the early adopters and those who patiently wait, I’ve been gathering information on the right and wrong ways to set up Windows 7. For the past week or so I’ve been installing and upgrading the RC code on a wide variety of systems—notebooks and desktops, with and without touch and tablet capabilities, with and without TV tuners and Blu-ray drives, as clean installs and upgrades, in x86 and x64 flavors, documenting the process.
In this post, I want to share seven of the lessons I’ve learned along the way, including a few setup secrets that even some Windows experts don’t know about.
Secret #1: Choose the right Setup option
Secret #2: Start with a clean disk
Secret #3: Back up your old drivers first
Secret #4: Do a nondestructive clean install
Secret #5: You need less disk space than you think
Secret #6: Unblock the upgrade path for Windows 7 beta
Secret #7: Unlock those extra editions
Ed Bott is an award-winning technology writer with more than two decades of experience writing for mainstream media outlets and online publications.
Microsoft offers Windows XP, Office XP users 50 percent discount to encourage upgrades
Microsoft officials are well aware that its biggest Windows 7 and Office 2010 competitors are its own previous product iterations (Windows XP and Office XP/2003). To try and wean users away from older, “good-enough” releases, Microsoft is introducing a new licensing promotion.
The revamped “Up to Date Discount” program is targeted at small/mid-size business (SMB) customers running older versions of Windows and Office. Between January 1 and June 30 of this year, Microsoft is enabling users running Windows XP or Vista (on the operating system side) and Office XP, Office 2003 or Office 2007 (on the productivity suite side) to receive a discount of 50 percent on the cost of their licenses for Windows 7 and Office 2007 (or Office 2010, once it is released by June 2010).
The 50% discount calculations “are based on estimated retail prices and reseller prices may vary,” Microsoft officials acknowledge. But the Softies say U.S. customers who sign up for the program “would be paying $35.00 for a Windows 7 Professional Upgrade and/or $91.00 for Office 2007 Professional Plus in year 1, plus receiving all of the Software Assurance benefits (such as an automatic upgrade to Office 2010 when it launches, Office Home Use Rights, and much more) for that price.”
As you’d expect, there are lots of caveats. First, customers get the 50 percent discount only for the first year of their Open Value Subscription (OVS) payment. (OVS is a Microsoft licensing program, introduced last year for SMBs, which allows users to pay for software licenses over time and includes many of the same provisions as Microsoft’s Software Assurance licensing program.) The new deal applies only to those customers using the Professional versions of Windows and/or the Professional versions of Office.
The new promotion, which Eric Ligman, Global Partner Experience Lead with Microsoft’s Worldwide Partner Group, announced via the Microsoft SMB Community Blog on January 1, goes beyond the current Up-to-Date Discount offer. Before the new so-called “N-2” update to the program was put in place, Windows XP users and Office XP users were ineligible for the discount. But Microsoft is now offering users of the older Windows and Office releases coverage if they’re willing to sign up for the Open Value Subscription plan.
Meanwhile, speaking of new Microsoft licensing promotions, Microsoft is introducing “version 4.0″ of another SMB promotional licensing offer, known as “The Big Easy,” according to Ligman.
Starting January 3, SMB customers can increase dollars available for them to spend with Microsoft partners “by purchasing multiple qualifying product groups, adding Software Assurance to their orders and/or acquiring advanced, premium or Enterprise editions of the Microsoft Solutions.” To qualify, customers need to buy products through their Microsoft partners between January 3 and March 31 via the Microsoft Open License, Open Value and/or Open Value Subscription programs.
Products covered under the program include Dynamics CRM, Office Communications Server, SharePoint Server, SQL Server, System Center and Windows Server, among others.
10 Linux features Windows should have by default
The battle between Linux and Windows will most likely rage on for years to come. I can foresee that even when all things migrate to the cloud, users in both camps will still be screaming the virtues of their favorite operating system. And, of course, I will be one of those campers (and I can bet you know just which camp I’ll be in). But being in that camp does not preclude me from seeing the benefits and strengths of the Windows operating system.
In my next two 10 Things articles, I am going to take pieces of each operating system and place them in the other. In this first article, I am going to share 10 features from the Linux operating system that should be in the Windows operating system. In the next article, I will go the other way.
Now you should know, features will encompass literal features as well as systems and even philosophies. I don’t want to leave anything out of the picture. In the end, my hope is that theoretically, at least, we’ll have a much more ideal operating system. Of course, you can (and will) be the judge of that. Let’s get going and start adding Linux features to Windows.
1: Compiz
No matter how clean Aero gets, I am not a fan of the flat, single-workspace desktop of Windows 7. Yes, it has come a long way, but it’s not nearly the modern desktop that Compiz offers. Of course, many would argue that Compiz is nothing more than eye candy. I, on the other hand, would argue that many of the features Compiz offers are just as much about usability as they are eye candy. Having a 3D desktop that offers you quick access (via key combinations) to multiple workspaces is handy. Window switchers can’t be beaten for ease of use. And the eye candy is just a bonus. Having Compiz on top of Windows would certainly take the experience to a level few Windows users have experienced.
2: True multi-user support
Yes, I know you can have multiple accounts on a Windows 7 box, but that doesn’t make it truly multi-user. Can you log on more than one user at a time in Windows 7? Not by default. To have concurrent user sessions for Windows 7, you have to download a third-party tool. In Linux, you can do this by default. This is a feature that should be enabled by default in Windows 7, too.
3: Log files
Windows operating systems have plenty of tools that enable the administrator to read log files. But for system administration and security issues, the administrator must fire up those tools just to see the logs. Linux, by contrast, places all system log files in /var/log and allows any user (with the right permissions) to read them in a simple text editor. And Linux log files are flexible in many ways. For instance, if I want to follow a system log, I can open that log in a terminal window with the tail -f command and watch as events occur.
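As a rough illustration of how accessible plain-text logs are, here is a minimal Python sketch of a tail-style helper. The throwaway file merely stands in for something like /var/log/syslog, which needs appropriate read permissions on a real system:

```python
import os
import tempfile

def tail(path, n=10):
    """Return the last n lines of a plain-text log file, like `tail -n`."""
    with open(path, encoding="utf-8", errors="replace") as f:
        return f.readlines()[-n:]

# Demo on a throwaway file standing in for a real log:
with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as f:
    f.write("service started\nservice ready\nrequest handled\n")

print(tail(f.name, 2))  # ['service ready\n', 'request handled\n']
os.remove(f.name)
```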
4: Centralized application installation
The new paradigm for Linux is a centralized location for installation. The Ubuntu Software Center is turning out to be the culmination of much of this work. From one source, you can search through thousands of applications and install any one you need. And with upcoming releases of the Ubuntu Software Center (version 3, to be exact), commercial software will be available.
5: Cron
I am a big fan of Cron. Cron jobs enable you to easily automate tasks. Yes, you can add third-party software on a Windows operating system to help automate tasks, but none will have the flexibility of the cron job. Cron allows you to schedule as many tasks as you like, at any time you like, from a simple command-line tool (or a GUI tool, if you so desire). And cron is available system wide — for both administrative tasks and standard user tasks. Having an automated system built in would certainly be handy.
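For readers who haven’t used it, a crontab entry is a single line: five time fields (minute, hour, day of month, month, day of week) followed by the command to run. Here is a sketch of what that looks like; the script paths below are hypothetical, purely for illustration:

```shell
# Edit your personal crontab with: crontab -e
# Fields: minute hour day-of-month month day-of-week command

# Run a (hypothetical) backup script at 2:30 AM every day:
30 2 * * * /home/user/backup.sh

# Run a (hypothetical) report at 9:00 AM, Monday through Friday:
0 9 * * 1-5 /home/user/report.sh
```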
6: Regular release cycle
This is one of those areas where Microsoft could learn a serious lesson from the Linux camp. Most Linux distributions release their updated distributions on a regular basis. And even better, they stick to these schedules to the best of their ability. Take Ubuntu, for example. For each release there is a .04 and a .10 version. The .04 version is released in the fourth month of the year; the .10 version is released in the 10th month. This happens like clockwork. So Ubuntu 10.04 will be released in April 2010 and Ubuntu 10.10 in October 2010. Granted, sometimes those releases don’t start populating the mirrors until the last second of the month, but they are as regular as they can be.
7: Root user
Let’s face it — by default, the average user can do too much in Windows. So much so, it becomes simple for someone to write a nasty little virus that can be spread simply by opening up an attachment in an email. With the way Linux is set up, this doesn’t occur. For damage to be done to a system, generally speaking the root password must be known. For example, if a user clicked on an attachment from an email, and that attachment demanded the root (or sudoers) password, that would be a quick indication that the attachment was malicious. Windows should separate the administrative user and the standard user by default. The first thing Windows users should have to do, upon starting up their new computer for the first time, is create an administrative password and a user password.
8: One version, one price
Okay, I’m not going to say Windows should be free. What I am going to say is that it should have one version and one price (with a nod to bulk pricing). Why do I say this? Simple. Which version should you buy? Do you need Premium or Ultimate? Which sounds better? Is “premium” better than “ultimate”? Here’s an idea — just have one version for the desktop and one for the server. It works for Linux. Less confusion and frustration for the consumer, less advertising waste for Microsoft. And all those features that cause the most expensive version of Windows 7 to be thus — the average user wouldn’t know how to use them anyway.
9: Installed applications
I know that Microsoft doesn’t include any useful applications (minus a browser) by default for a reason — to make money. But when I install Linux for the average user, I’m done. I don’t have to install an office suite, an email client, or audio/visual tools. Outside of installing financial applications and the odd power-user tool (which is all handled in a single, centralized location — see #4), there’s nothing more to do once the OS installation is done. Microsoft could at least include Word.
10: Hardware detection
Before anyone gets bent out of shape, this is not what you’re thinking. Let me set this up for you. What happens when you install a Windows operating system and something doesn’t work? Say, for example, video. You thought for sure the OS would support your video card, but when the installation is complete you’re stuck with good old 800×600 resolution. So you go to the device manager to see if you can find out what the card is, and you get nothing. How are you supposed to find out what drivers to download when Windows gives you no information? Oh sure, you can open up the case and check out the chipset. Or you might get lucky and find that device driver CD lying around. But what if you can’t? Or what if that video is on board?
If you were using Linux you could at least issue the dmesg command and get some information right away. And if dmesg didn’t help, you could always fire up the Hardware Drivers tool, which might discover a proprietary driver you could use. In Windows, if you don’t know the card, you’re going to have fun finding the drivers. Windows hardware support may be broader, but Linux hardware detection is better.
Those are 10 features I would like to see make the jump from Linux to Windows. Do you agree? Is there a feature listed you think might hinder the Windows operating system? Is there a Linux feature not listed that you would like to see jump the fence? If so, let us know. Next time: 10 Windows features I’d like to see in Linux. No, really.
Larry Dignan is Editor in Chief of ZDNet and Editorial Director of ZDNet sister site TechRepublic.
Will Chrome netbooks really be competitive?
Acer, which last quarter overtook Dell to become the world’s second-largest PC manufacturer, is known as an aggressive company. So it’s little surprise that the company’s chairman, J.T. Wang, vowed this week that Acer would be first to release a netbook using Google’s Chrome OS sometime in the middle of next year. How big a deal this really is remains to be seen.
Acer is the same company that, back in June, promised it would be first to ship a netbook using Google’s other OS, Android. In October, Acer made good on its promise by releasing a version of its Aspire One D150, a 10.1-inch netbook, with Android. But it turned out to be a dual-boot system that relied–like most netbooks–on Windows XP. Android simply provides an alternative way to access to a few functions such as Gmail, Web browsing and Google Calendar–in effect it works like the other Linux-based pre-boot operating systems found on several laptops and netbooks. Since Android was really designed for smartphones, it also has some serious limitations as a netbook OS. For example, the Firefox browser does not support Flash, and you can’t open attachments, edit documents, access a USB drive or print.
There’s good reason to believe that Chrome OS will be a more formidable challenger. Google is clearly taking the time to make sure that Chrome has the basic performance, security and features that users expect in a netbook by the time it ships sometime in the second half of 2010. And it is working with a broader group of PC manufacturers including not only Acer, but also Asus, HP, Lenovo, Toshiba and, it seems, Dell. Though they refer to future products as netbooks, Google says it is pushing for slightly larger displays — most likely ranging from 11.6 to 13.3 inches — and full-size keyboards to improve usability, which means they’ll probably look similar to the low-cost, ultra-thin laptops that are becoming more prevalent. All of this should add up to a more competitive product.
Despite its flaws, the x86 Windows ecosystem is proving to be very hard to replace. Nokia is just the sort of company you’d expect to release a so-called smartbook that shakes things up, yet its Booklet 3G uses an Intel Atom processor and runs Windows 7. That’s not to say Google won’t make some headway. By this time next year Acer and others could well have several interesting Chrome OS netbooks on store shelves. In a world filled with near-identical netbooks, it would be nice to have a little variety. But clearly there’s a lot of work left to be done on both the software and hardware.
John Morris is a former executive editor at CNET Networks and senior editor at PC Magazine.
Why “good enough” simply isn’t with laptops
The New York Times published its annual catalog of the Year in Ideas. One of them, Good Enough is the New Great, is a concept derived from a story in the August issue of Wired (The Good Enough Revolution), which noted that some of the most successful gadgets and applications of late are a triumph of mediocre technology over the latest and greatest.
The Flip’s success stunned the industry, but it shouldn’t have. It’s just the latest triumph of what might be called Good Enough tech. Cheap, fast, simple tools are suddenly everywhere. We get our breaking news from blogs, we make spotty long-distance calls on Skype, we watch video on small computer screens rather than TVs, and more and more of us are carrying around dinky, low-power netbook computers that are just good enough to meet our surfing and emailing needs. The low end has never been riding higher.
Lately I’ve been thinking a lot about this concept of good-enough computing. The success of netbooks in 2009 seems like an obvious example of good over great, but I’m not convinced that is what the netbook phenomenon is really all about. I think it has a lot more to do with demand for highly mobile computing at an affordable price. No sooner had netbooks hit the big time than chipmakers began trying to address performance shortcomings such as the inability to play HD video. AMD’s ultra-thin platform (formerly known as Congo), Nvidia’s Ion chipset and Intel’s upcoming Pine Trail platform are all designed to boost performance of netbooks.
In general, I think there’s still room for significant improvement in performance and battery life of laptops. It’s true that the typical $600 mainstream laptop on the shelf at Best Buy can handle most tasks. And there’s more choice than ever in terms of size and weight, price, and performance. But if you think about it, we’re still far from having it all in one laptop. Netbooks and thin-and-lights based on ultra-low-voltage chips are highly portable and have excellent battery life, at the expense of performance. Budget and mainstream laptops are priced right and have decent performance, but they are too bulky and battery life is poor. If you really want the best performance, you can choose a notebook with an Intel Core i7 quad-core (Clarksfield) processor, but these are generally available only in expensive 17-inch desktop replacements that are marginally portable and designed largely for gamers. (Yes, Dell’s 15-inch Alienware M15x also comes with Core i7, but it weighs in at more than 10 pounds.)
HP deserves credit for trying to build a laptop that has it all, but the Envy 15 illustrates just how hard this is to do using current technology. The Envy 15 is reasonably portable, measuring one inch thick and weighing 5.4 pounds, and it has a Core i7 quad-core processor and a 1920×1080 15.6-inch display. But all that comes at a price. The Envy 15 starts at $1,800 with a 1.60GHz Core i7-720QM, 6GB of memory, ATI Mobility Radeon HD 4830 graphics and a 500GB hard drive. The Envy 15 is also one of the few 15.6-inch laptops you’ll find that doesn’t include an internal optical drive, to keep the size and weight down. And while performance was extremely good on CNET’s tests, battery life was not. Numerous other reviews noted that the Envy 15 runs so hot that it is actually “uncomfortable to use.”
The near-term roadmap doesn’t offer much hope for closing this gap. In early 2010, Intel will release its first 32nm Westmere processors, including Arrandale for laptops and Clarkdale for desktops, but these will be designed for mainstream laptops, where the bulk of the sales are. In addition, these processors will include Intel’s integrated graphics on the same chip for the first time. This will simplify system design, and should help lower prices, but unless Intel has made huge strides in the performance of its integrated graphics, these Arrandale-based laptops won’t satisfy power-users. Of course, the 32nm chips can also be paired with discrete graphics, and later in the year we should get faster Westmere chips that come a bit closer to Clarksfield. AMD will release its first laptop platform (Danube) with a quad-core processor in the first half of next year, but like Clarksfield, AMD’s 45nm chip is likely to be too big and hot for anything but desktop replacements. Meanwhile AMD’s next ultra-thin platform, Geneva, and the Fusion chip with an integrated GPU that arrives in 2011, are targeted at the mainstream, and not the performance segment. The bottom line: Don’t look for a thin 13.3-inch laptop that offers Core i7-level performance and solid battery life anytime soon.
Last week, I attended a semiconductor conference where chipmakers discussed their latest technology. There was a lot of talk at the show about a coming slowdown in the pace of innovation. Each new generation of process technology is tougher and more costly than the last, the argument goes, so does it really make sense to stay on this treadmill? But Intel execs–who noted (repeatedly) that the company has already shipped more than 200 million processors using high-k and metal gate technology while the rest of the industry is still figuring it out, and is already manufacturing 32nm chips in two factories–said they have no plans to let up. That’s good news because today’s laptop technology isn’t anywhere near good enough.
John Morris is a former executive editor at CNET Networks and senior editor at PC Magazine.
Nine ways IT can help organizations ‘go green’ and reduce paper consumption
Commentary - This holiday season, think about ways to “give back” to the environment.
The liquid in printer cartridges - which carries a price tag of about $10,000 per gallon - costs far more than the most expensive bottle of champagne any of us will buy over the next few weeks. And despite the popularity of recycling, each year millions of empty toner and inkjet cartridges used in laser printers, fax machines, and copiers are thrown in the trash, destined for landfills and incinerators.
As more enterprises look for ways to ‘go green,’ many do not realize that re-aligning basic information technology (IT) practices can help play a part in becoming more environmentally responsible.
One way for IT teams to help reduce waste is to implement new approaches within daily processes. It’s not enough to reduce the amount of paper we use, as beneficial as that is to the environment and to the cost of doing business. As business processes move toward being completely electronic, enterprises need to think about ways to reduce their “paper footprint.”
Continued use of paper to record critical business transactions can weigh down organizations because of the cost of paper and printing, compliance risks and the environmental challenges of disposing of paper. Yet, there are fairly simple steps that organizations can take to reduce paper consumption. They are:
Use business analytics software: Integrate software that automates manual reporting and analysis, and electronically distributes reports over the Web or on mobile devices. One mid-size company estimates that it saved enough paper to cover 5,519 football fields on a yearly basis simply by moving manual-based financial and operational reporting processes to a business intelligence system.
Re-align business processes: Automate and streamline business processes among people and systems, reducing paper consumption by eliminating unnecessary paper trails and content storage costs.
Move business tasks to an electronic format: Encourage non-technical employees to try electronic forms and survey software that does not require an IT department’s resources. Traditionally, compiling forms and surveys took several technical workers weeks, not minutes, at a significant cost in an IT department’s time and salaries. For example, electronic forms are currently used by more than 1.4 million Army personnel worldwide, yielding a projected $1.3 billion in cost savings to the U.S. federal government.
Monitor and regulate printing: Encourage employees to edit and review documents in electronic form, while promoting a paper-free environment. For example, don’t ask employees to print meeting agendas. Instead, use a whiteboard or laptops to take notes during meetings.
Eliminate the unnecessary printing of documents: Prevent IT teams from writing and then printing massive documents that are quickly out-of-date as requirements change. Use software to make requirements gathering an electronic process, giving teams the ability to visually capture requirements for a project using sketches, storyboards, comment threads and rich-text editors. An IBM “No Paper Weight” study indicates that when companies stop printing their “born digital” documents, paper consumption can be reduced by 80 to 90 percent.
Review software code – online: Don’t print out code for “code review” - like proofreading a paper for grammar. Worldwide, more than 80 billion lines of code are written annually, representing a “mountain” of paper. Manual inspection is time-consuming and error-prone. IDC estimates the cost of fixing software defects at $5.2 million to $22 million annually, depending on an organization’s size.
Increase Data Center Capacity: Grow the capacity of an enterprise’s data center while reducing spiraling energy costs through facilities design, power and cooling infrastructure, active energy management and efficient, scalable systems.
Introduce Collaboration Tools: Use team collaboration software that lets people share links instead of attachments or hard copy documents, reducing storage and paper requirements. Use mobile devices. Today more than ever, as mobile software applications have grown in popularity, employees can complete most of their business tasks using their mobile devices. They can review, read and work on documents and other business tasks while on the go, reducing the amount of forms they might have printed in the past.
Consumption of large amounts of paper within organizations can lead to redundancy, increased costs, increased time and decreased quality. By making a New Year’s resolution to adopt at least some of these simple strategies, organizations can take steps to improve business processes and cost savings, while embracing “green IT,” making themselves a more socially responsible and attractive employer and vendor.
Leslie L. Gordon is vice president of Application and Infrastructure Service Management in the Office of the CIO of IBM.
Your car, your hot spot and Ford’s tech utopia
Patch Tuesday: Microsoft plugs IE ‘drive-by download’ security holes
Microsoft today shipped six bulletins with patches for a total of 12 documented security vulnerabilities in a wide range of widely deployed software products. Three of the six bulletins are rated “critical,” Microsoft’s highest severity rating.
The most serious issues affect the company’s Internet Explorer browser, including the newest IE 8 on Windows 7. The Internet Explorer bulletin (MS09-072) covers five documented vulnerabilities that affect all supported versions of the browser (IE 5, 6, 7 and 8). As previously reported, there is public exploit code available for one of the IE vulnerabilities.
[ SEE: Exploit published for critical IE zero-day flaw ]
Here’s why this is considered a high-priority update for all affected Windows users:
The vulnerabilities could allow remote code execution if a user views a specially crafted Web page using Internet Explorer.
An interesting sidebar: All five of the IE vulnerabilities were purchased by a third-party company that buys software flaw information in exchange for the exclusive rights to broker the disclosure process with affected vendors.
This month’s Patch Tuesday batch also covers two potential worm holes in Microsoft Windows (Internet Authentication Service). The update (MS09-071) patches critical flaws that could allow remote code execution if messages received by the Internet Authentication Service server are copied incorrectly into memory when handling PEAP authentication attempts.
An attacker who successfully exploited either of these vulnerabilities could take complete control of an affected system. Servers using Internet Authentication Service are only affected when using PEAP with MS-CHAP v2 authentication.
The third critical bulletin (MS09-074) addresses a security flaw in the Microsoft Office Project software. The vulnerability could allow remote code execution if a user opens a specially crafted Project file.
Microsoft also shipped three “important” bulletins to cover the following:
MS09-069: Vulnerability in Local Security Authority Subsystem Service. This security update resolves a privately reported vulnerability in Microsoft Windows. The vulnerability could allow a denial of service if a remote, authenticated attacker, while communicating through Internet Protocol security (IPsec), sends a specially crafted ISAKMP message to the Local Security Authority Subsystem Service (LSASS) on an affected system.
MS09-070: Vulnerabilities in Active Directory Federation Services. Resolves two privately reported vulnerabilities in Microsoft Windows. The more severe of these vulnerabilities could allow remote code execution if an attacker sent a specially crafted HTTP request to an ADFS-enabled Web server. An attacker would need to be an authenticated user in order to exploit either of these vulnerabilities.
MS09-073: Vulnerability in WordPad and Office Text Converters. Patches a privately reported vulnerability in Microsoft WordPad and Microsoft Office text converters. The vulnerability could allow remote code execution if a specially crafted Word 97 file is opened in WordPad or Microsoft Office Word. An attacker who successfully exploited this vulnerability could gain the same privileges as the user. Users whose accounts are configured to have fewer privileges on the system could be less impacted than users who operate with administrative privileges.
Microsoft’s Security Research & Defense blog offers this nifty chart to help Windows users prioritize the deployment of the other updates appropriately.
MS09-072 (IE)
Most likely attack vector: Attacker hosts a malicious webpage and lures the victim to it.
Severity: Critical. Maximum exploitability index: 1.
Likely first-30-days impact: Public exploit code already exists for CVE-2009-3672, affecting IE6 and IE7. Exploits for the other vulnerabilities, affecting other IE versions, are expected within 30 days.
Platform mitigations: DEP is enabled by default for IE8 on Windows XP SP3, Windows Vista SP1 and later, Windows Server 2008, and Windows 7. DEP makes exploiting the public vulnerability significantly more difficult.
MS09-073 (WordPad converter)
Most likely attack vector: Attacker sends a malicious .doc file (saved in the legacy Word version 8 format) to a victim who opens it in WordPad.
Severity: Critical. Maximum exploitability index: 2.
Likely first-30-days impact: Less likely to be exploited in the first 30 days.
Platform mitigations: Affects only older platforms.
MS09-071 (IAS)
Most likely attack vector: Attacker on a wireless LAN attacks the Microsoft IAS server providing 802.1x authentication and encryption via PEAP; the attack would be via the RADIUS protocol.
Severity: Critical. Maximum exploitability index: 2.
Likely first-30-days impact: Less likely to be exploited in the first 30 days.
MS09-074 (Project)
Most likely attack vector: Attacker sends a malicious Project file (.mpp) to a victim who opens it with Project 2003 or earlier.
Severity: Critical (on Project 2000 only). Maximum exploitability index: 2.
Likely first-30-days impact: Less likely to be exploited in the first 30 days; affects only older versions of Project.
MS09-070 (ADFS)
Most likely attack vector: Attacker able to authenticate to ADFS running in IIS can execute code within the IIS worker process.
Severity: Important. Maximum exploitability index: 1.
Likely first-30-days impact: While an exploit may be developed in the first 30 days, the risk to most organizations is low because the attack surface is exposed only to authenticated attackers.
MS09-069 (LSASS)
Most likely attack vector: Attacker on an enterprise network authenticates to a server and remotely causes CPU exhaustion.
Severity: Important. Maximum exploitability index: 3.
Likely first-30-days impact: Unlikely to be exploited in the first 30 days; no chance of code execution.
My Windows 8 wish list
As I’ve said before, I like Windows 7. In fact, I like the OS a lot. It reminds me a lot of the good ol’ NT4 days. I wouldn’t go as far as to say that I’ve fallen in love with Windows again because times are different and I enjoy a polyamorous existence where I use several different OSes. But Windows 7 has reminded me of the fact that when Windows is done right, it can be a cracking OS.
But all this emotion directed at a big pile of 0s and 1s doesn’t mean that I don’t see room for improvement. In fact, I’ve already drawn up a Windows 8 wishlist, which I’ll share with you here.
A modern installer
Look, it’s the 21st century. Installing the OS on one drive and setting it up so that the data is stored on another drive should be a trivial matter that’s handled during setup. The current installer is simply prehistoric and I hope to see dramatic improvements in Windows 8.
Better support for compressed file formats
There are a number of very good, free, open source tools for handling compressed files out there. My favorite is 7-Zip and it’s capable of handling all sorts of exotic archives. However, I’m still sort of surprised that apart from supporting .ZIP archives, Windows still can’t handle any other commonly used compressed file format.
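As a sketch of what broader built-in support looks like, Python’s standard library already packs and unpacks several common archive formats through one interface. The file names below are invented for illustration:

```python
# A small sketch of multi-format archive handling, the kind of support
# the wish list asks Windows to build in. Python's standard library
# packs and unpacks .zip and .tar.gz through one interface.
import os
import shutil
import tempfile

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "docs")
os.makedirs(src)
with open(os.path.join(src, "readme.txt"), "w") as f:
    f.write("hello")

# Pack the same folder as .zip and .tar.gz, then unpack each copy.
for fmt in ("zip", "gztar"):
    archive = shutil.make_archive(os.path.join(workdir, "docs-" + fmt), fmt, src)
    dest = os.path.join(workdir, "out-" + fmt)
    shutil.unpack_archive(archive, dest)  # format detected from the extension
    print(fmt, open(os.path.join(dest, "readme.txt")).read())
```

The point is that one consistent interface over many formats is a solved problem at the library level; an OS shell could expose the same thing.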
Unified security tools
This seems like a no-brainer to me. Microsoft should bring all the security software under one application in the next incarnation of Windows.
Updated Task Manager
The Windows Task Manager is a very useful tool. However, it’s very long in the tooth and overdue for a revamp (apart from some minor additions, it’s the same Task Manager that was present in NT4). It doesn’t need to be as complex and fully-featured as Sysinternals’ Process Explorer, but more features could be useful.
Software install center
Microsoft has a lot of cool, free software on offer but unless you know where to look for it, you’ll never find it. Linux distros such as Ubuntu have a Software Center where users can download new stuff from. Microsoft needs something similar, along the lines of how it delivered Ultimate Extras to users.
End to 16-bit/32-bit support
There is a time and a place for dumping legacy support and moving on. Windows 8 should be that time.
Reboot-free updates
If you can already update certain Linux distros without requiring a reboot, we should be able to do the exact same thing in the next version of Windows.
Native .ISO handling
Yes, Windows 7 allows me to natively burn .ISO files to disc, but why do I have to scrabble around like a raccoon in a dumpster looking for a tool in order to be able to create and mount these files? Again, the next version of Windows should be able to handle .ISO files natively.
Adrian Kingsley-Hughes is a technology journalist and author who has devoted over a decade to helping users get the most from technology.
Microsoft ends discounted Windows 7 Family Pack deal (but maybe not for good)
When Microsoft launched its Family Pack deal for Windows 7 this fall, officials said the offer would be a limited-time one. They declined to say when the company would phase out the Family Pack offer, but it now appears that day has come, as first reported by Windows SuperSite blogger Paul Thurrott.
(Microsoft did the same with Vista; the Family Discount deal for Vista Ultimate that it offered back in 2007 was a temporary one.)
The Windows 7 Family Pack allowed users to buy three copies of Windows 7 Home Premium, as of late October, for $149.99 in the U.S. and other select markets. But according to the Microsoft Family Pack Web site, “The Windows 7 Family Pack offer has ended.” (Computerworld says the offer ended on December 1, with availability curtailed a week before.)
Microsoft’s marketing strategy around Windows 7 has been one meant to create excitement and demand. One way to do this is to make discounted offers “for a limited time only.” Hurry up and get it or you’ll lose out! Don’t delay! Buy now! and all that jazz. Ending the promotion just before Christmas smacks of Grinchism, as my blogging colleague Ed Bott noted — a trade-off that might do more harm than good (unless good is measured solely by how many more dollars it helps Microsoft rake in from holiday shoppers).
Here’s one thing to remember, though: Throughout the development of the product, the Windows 7 management also has been emphasizing “we’re listening to our users.” (Examples: Here and here.) So even though Microsoft isn’t saying it plans to resume the promotion, don’t be too surprised if the Softies do an about-face in the coming weeks and offer the Windows 7 Family Pack again for a limited time. (Especially if enough users threaten to dump Windows for Mac OS X.)
Mary Jo Foley has covered the tech industry for more than 20 years.
10 Windows features I would like to see in Linux
I recently shared my list of 10 Linux features I think should be included in Windows. Today, I’m going to challenge myself by finding 10 features in the Windows operating system that I would like to see make their way to Linux. I am not going to play the typical fanboy and make a joke of this by saying there is nothing in the Windows operating system that would be welcome in the other camp. We all know there are plenty of outstanding features in the Windows operating system. But I might stretch the nature of the word “features” to include a few items that are less inherent in the OS and more about the community or business model.
So with that said, let’s dive into this ocean and see what we catch.
1: Marketing
I have to start with the big guns. There may be only one IT-related business with a better marketing machine than Microsoft — Apple. But that’s a big “may be.” And everyone knows how small and inefficient the Linux marketing machine (or lack thereof) is. I feel fairly confident in saying that if the Linux operating system could enjoy the marketing that Windows enjoys, no other operating system would stand a chance.
2: Hardware support
I say this somewhat half-heartedly, because the hardware support Linux enjoys has come such a long, long way. But there are still areas where it could use a huge bump. Specifically, wireless. Most often a lack of a working wireless connection in Linux is due to having an unsupported chipset. And although the list of unsupported chipsets is getting smaller and smaller, it still exists. When potential new users come across an issue like this, they inevitably run back to Windows because they know their hardware will work. They may have to spend an hour (or a day) looking for drivers, but they know they can get it to work.
3: Smart phone syncing
Regardless of the type of smart phone you use, one of the biggest benefits of using it is that you can sync it with your PC. At least you can in Windows. Many smart phones quickly become slightly crippled when plugged into a Linux machine. Even my HTC Hero, which uses the Android operating system, can’t sync with Linux. Yes, you can add music to your Android phone. But try to sync contacts, calendars, or email with Evolution or KMail and you’re in for a never-ending nightmare. On the Windows operating system, this task is a complete no-brainer.
4: Enterprise presence
On so many levels, Linux is a perfect match for SMB and enterprise usage. Be it the desktop or the server, Linux could help improve the efficiency of workers. But that has not happened and will not happen without some real change. Exactly what that change is, I am not sure. But I do believe most of the change needed is on the business end — and we all know that is not going to happen. But if Linux could enjoy the presence that Windows has in the enterprise, the whole landscape of IT (from business to home use) would change.
5: Workgroup setup
I can get Samba set up pretty quickly, but that is after years of working with Linux. The average user would be hard-pressed to get this working. Joining a Windows machine to a workgroup is simple. Linux needs to gain this user-friendly ability to see and work with Windows machines with very little setup (and especially no editing of smb.conf).
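For a concrete (and entirely illustrative) sense of what that setup currently involves, here is a minimal smb.conf of the kind a user must hand-edit today; the share name and path are invented, and this is a sketch rather than a recommended configuration:

```ini
; Minimal Samba sketch -- workgroup name, share name and path are
; examples only, not a production configuration.
[global]
   workgroup = WORKGROUP
   server string = Linux file server
   security = user

[shared]
   path = /home/shared
   browseable = yes
   read only = no
```

With a file like this in place (typically at /etc/samba/smb.conf), the Linux machine can appear alongside Windows machines in the workgroup; the point above is that none of this hand-editing should be necessary.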
6: Touchscreen support
One of the big to-dos with Windows 7 was the improved touchscreen support. Linux can work with a limited number of touchscreens (see #2), but to do so often requires the user to work with the xorg.conf file. And since X11 is now working with a xorg.conf-less setup, this is even more difficult. Although I’m not a fan, touchscreen could be the future of computing. It has worked majestically for the iPhone, so why not for the desktop PC? If that’s the case, Linux had better get some Windows-like support worked into the picture.
7: Pre-installed systems
This could easily dethrone #1 from the top spot. A handful of companies (System76, for instance) offer pre-installed Linux solutions. If anything would give a bigger boost to Linux acceptance than pre-installs, I’d like to know what it is. Pre-installed operating systems are what gets the OS into the hands of the user. Sure, anyone can install an operating system if they want to (and have the IQ to do so - and we’re not looking at Sheldon Cooper levels of IQ), but this doesn’t happen on a regular basis.
8: Vendor support
This is a tough one. If you have a problem with Windows you can call Microsoft tech support (so long as you have the time). If you have a problem with Linux, who ya gonna call? You can call Canonical for Ubuntu support, if you’ve purchased a support package. You can call Novell for SuSE support, if you’ve purchased a support package. You can call Red Hat for Red Hat support, if you’ve purchased a support package. But what happens when you buy that shiny new computer, wipe off Windows, install Linux, and have a problem? Most likely, you’re going to hear that you have invalidated the warranty or support contract by doing so. PC makers need to learn to support the Linux operating system.
9: Software installation
I want to preface this by saying the Ubuntu Software Center will eventually negate this point. But for now, we’ll continue on as if USC doesn’t exist. To install an application on Windows, you simply download the installer and double-click the file. To install an application on Linux, you have to search for the application in a tool like Synaptic, mark it for installation, and apply the changes. After you click Apply, you have to hope that all dependencies have been met. And if you can’t find the software within Synaptic (or whichever tool you use), you have to add the repositories that house the software you need. I am a big fan of how Linux is evolving (thanks to tools like Synaptic and the Ubuntu Software Center), but new users expect to be able to download a single file and double-click it to install.
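To illustrate the dependency step described above, here is a toy sketch of the resolution a tool like Synaptic performs behind the scenes when you mark a package; the package names and dependency map are invented for the example:

```python
# Toy dependency resolver: expand one requested package into the full,
# correctly ordered install set. The package graph below is made up.
deps = {
    "gimp": ["gtk", "libpng"],
    "gtk": ["glib"],
    "glib": [],
    "libpng": ["zlib"],
    "zlib": [],
}

def install_set(pkg, resolved=None):
    """Return pkg plus everything it depends on, dependencies first."""
    if resolved is None:
        resolved = []
    for dep in deps[pkg]:
        install_set(dep, resolved)  # recurse into dependencies first
    if pkg not in resolved:
        resolved.append(pkg)
    return resolved

print(install_set("gimp"))  # every dependency appears before its dependent
```

A real package manager also handles versions, conflicts, and dependency cycles; the point is simply that the user-visible “mark and apply” step hides a non-trivial amount of work.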
10: DirectX
One thing that keeps many users from migrating to Linux is the lack of DirectX. What would DirectX support do for Linux? In a word, games. Games are the reason so many people will not migrate to Linux. There are a lot of gamers out there, and until DirectX comes to Linux, their games will remain tied to the operating systems that support it.
Larry Dignan is Editor in Chief of ZDNet and Editorial Director of ZDNet sister site TechRepublic.
Windows 7 is less secure than Vista
According to a well-respected security firm, Microsoft’s flagship Windows 7 operating system is less secure in its default configuration than Vista.
Trend Micro CTO Raimund Genes believes that Microsoft has put usability ahead of security:
“I’m not saying Windows 7 is insecure, but out of the box Vista is better.”
“I was disappointed when I first used a Windows 7 machine that there was no warning that I had no anti-virus, unlike Vista. There are no file extension hidden warnings either. Even when you do install anti-virus, warnings that it has not been updated are almost invisible.”
“Windows 7 may be an improvement in terms of usability but in terms of security it’s a mistake, though one that isn’t that surprising. When Microsoft’s developers choose between usability and security, they will always choose usability.”
Interestingly, Genes believes that the XP Mode feature present in some editions of Windows 7 actually improves security because it makes available a sandboxed OS. Other security firms (in particular Sophos) have criticized XP Mode, labeling it a security risk because it needs to be patched separately.
So, for a more secure Windows 7, Trend Micro recommends raising your UAC setting.
Adrian Kingsley-Hughes is a technology journalist and author who has devoted over a decade to helping users get the most from technology.
Microsoft and PC makers readying more Windows 7 systems for small businesses
I run a (very) small business — a freelance-writing business of one. When I was looking to buy a new Windows 7 PC this past fall, I have to admit I was underwhelmed.
I saw lots of shiny new PCs aimed at retail/consumer customers with super glossy displays and nail-polish colors. And I saw lots of plain-vanilla, pricey machines that seemed to be targeted at corporate users who needed higher-end features like the ability to securely join a corporate network. But I didn’t see much or hear much about Windows 7 machines for folks working in businesses with a handful of PCs. I wanted a stylish but professional, lightweight PC with substantial RAM and disk space that wasn’t optimized for playing games and watching movies — and that wasn’t in the $2,000-plus price range. Did these kinds of machines exist — beyond my check-list dreams?
For the most part, no. But new Windows 7 SMB systems are coming, said Sandrine Skinner, a Director in the Windows Commercial Product Management unit. Microsoft is working with PC makers including Dell, HP and Lenovo on them. (She wouldn’t share any additional hints about what’s coming, other than to say “stay tuned”). Microsoft also is working on a set of guidelines to help SMB customers choose the right version of Windows 7 for their needs, and hopes to have those guidelines out before the middle of 2010, she said.
(The lag time between Windows 7’s general-availability date of October 22 and the availability of these new SMB PCs and guidelines isn’t a deal breaker. Many business users have said they aren’t planning to move to Windows 7 until some time in 2010 or 2011. But I’d argue Microsoft and its partners need to come to market with these sooner rather than later, if they want to ride the current consumer wave of interest in Windows 7.)
For many SMB users, Windows 7 Professional is likely to be the right fit, Skinner said. Professional is a SKU which Microsoft hasn’t done as much to evangelize as it has Home Premium and Enterprise. (Enterprise is for volume licensees with Software Assurance contracts only.) Professional costs $199 (for an upgrade from a previous version of Windows) and $299 for a new, full retail version. (Home Premium goes for $119 and $199, respectively.)
Windows 7 Professional includes some features that Home Premium doesn’t, such as location-aware printing, domain join, encrypted file system, and remote-desktop connectivity. It doesn’t include BitLocker/BitLocker To Go encryption, DirectAccess (VPN replacement technology), BranchCache, AppLocker and the ability to boot from VHD (all of which are Enterprise and Ultimate features).
Professional also includes XP Mode, the virtualization capability that allows users to run XP applications that won’t work natively on a Windows 7 machine — especially custom, line-of-business apps — inside of a virtualized XP environment. (XP Mode is supported on Professional, Enterprise and Ultimate, but no other Windows 7 SKUs). The development of XP Mode and integration of it into Windows 7 came from direct feedback from Microsoft’s partners and SMB customers, Skinner said.
“We had planned to offer Virtual PC. But they (partners and customers) pushed us to make this less technically complex,” Skinner said.
Is Pro the right choice for every small business user? No, Skinner admitted. Someone like me — who doesn’t have a server and can simply back up to a SkyDrive — probably doesn’t need all the features of Windows 7 Professional. In fact, I didn’t buy a Windows 7 Professional PC; I ended up buying an ASUS thin-and-light UL30A 13-inch system in silver (rather than the usual black) running Windows 7 Home Premium. It’s been great so far, though I wish they had offered a matte-screen option.
Skinner said to expect Microsoft to offer “a set of recommendations for a ‘business PC’” some time in the first half of next year. She also said to watch for Microsoft to do more Windows 7 advertising to the SMB segment of the market. And she said Microsoft is going to be expanding its “Ignite” early adopter/tester program for Windows SMB partners and customers going forward. For Windows 7, there were only 130 or so Ignite testers in 30 countries, who offered Microsoft direct feedback on Windows 7, starting with the Beta release of the product in January 2009.
Mary Jo Foley has covered the tech industry for more than 20 years.
5 Reasons to Refresh your PC in 2010
Boot up in seconds
New PCs with Windows 7 Professional feature tools like HP QuickLook 3 and QuickWeb, making it possible to access vital information on your desktop or the web in a matter of seconds, without having to boot up.
Get more done
A new computer desktop with an Intel® Core™2 Duo processor & Windows 7 Professional can run nearly 3 times faster than your old PC while using half the electricity. Also, compared to an older laptop with Windows XP, a new HP business laptop with Windows 7 can deliver up to a 68% increase in performance.
Save on energy
A new computer desktop can consume up to 55% less energy than an older desktop.
Reduce downtime
PCs older than three years can experience much more downtime than new ones. With new PCs, you can expect much more uptime, which in turn means more productive work time and more profitability for your business.
Improve your return on technology
Gain maximum value from new computers. VNet Professionals can access over $1,000 in FREE offers with every hardware purchase. Services like HP’s Trade-In Program, which gives you up to $100 cash back for your old PC when you trade in and upgrade your notebook, along with readily available financing options, make transitioning to new PCs easy and affordable.
Windows 7 compatibility problems? Microsoft might have an app (or service) for that
Microsoft is continuing to emphasize its “businesses should upgrade sooner rather than later” message with Windows 7 — and is using both carrots and sticks to push them to do so.
The latest attempt to convince customers comes in the form of take-aways Microsoft officials have uncovered and are sharing publicly from some of the early Windows 7 enterprise deployments. Norm Judah, the Chief Technology Officer of Microsoft Services (the group that encompasses Microsoft Consulting Services, consumer support and commercial support) discussed some of these learnings and offered advice during an interview I had with him on December 7.
While it’s pushing businesses to kick off deployment now, Microsoft isn’t suggesting enterprise users rush into things; in fact, Microsoft has been an advocate of a measured, 12-month-plus evaluation, assessment, compatibility-testing, deployment and training period.
“The assessment of compatibility is turning out to be the most interesting part” of the Windows 7 deployment process, said Judah, whose team is helping shepherd a number of companies through the process. “In some cases, the remedies (for compatibility problems) are fairly simple,” he said. (Microsoft provided, via a press release, an example of an unnamed European petrochemical company which was able to fix Windows 7 compatibility problems with more than 1,000 custom apps written in Visual Basic by changing a library module that was common to all of those apps.)
“There’s also the question as to whether customers really need an (incompatible) application,” Judah said. When performing an evaluation, customers have a chance to figure out which apps are worth taking the trouble to try to fix vs. which can be “discarded,” he said. Judah cited Lotus Notes as an example of an app that might be discardable… And no, I’m not kidding.
(Maybe if Microsoft is throwing in a free copy of Exchange plus offering to do all the migration work from Notes to Exchange. Otherwise, I’d tend to think Notes might fall more into the “mission critical” than the “who cares” department.)
Microsoft is continuing to flood businesses with Windows 7 deployment tools. Some of these are paid and offered to volume licensees only (like the Microsoft Desktop Optimization Pack set of tools.) Others are free frameworks, reference architectures, white papers and other kinds of aids. There’s a set of Desktop Deployment Planning services (another Software Assurance licensee-only offering) and Desktop Deployment Jumpstart. Judah’s team will perform on-site handholding and consulting for companies who want more help. He said Microsoft Services has helped some early deployers develop and run scripts that help secure and harden Windows 7 desktops.
There’s another part to what Judah’s team does: He and his troops take the learnings from the field back to Microsoft product groups to help them figure out what’s working and what isn’t as they continue to provide service and support — as well as begin work on service packs and the next versions of new products. At the same time, via a new program launched this year, Microsoft Services is packaging up the intellectual property it is acquiring while working on these kinds of deployments and providing it to Microsoft’s reseller partners via the “Services Ready” program.
If all those carrots aren’t enough to convince companies to move to Windows 7, Microsoft also is continuing to wield the stick. On Twitter, I’ve seen the OEM System Builder team tweet more than once over the past couple of days about the pending expiration of support for Windows XP SP2 and the Vista RTM (the release-to-manufacturing version, not Vista SP1 or SP2). Paid, extended support for XP SP2 ends July 13, 2010; free, mainstream support for Vista ends the same day. Consider yourself warned.
By: Mary Jo Foley, who has covered the tech industry for more than 20 years.
Guide To Choosing An Honest, Reliable, and Competent Computer Repair Technician
Don’t Trust Your Computer or the Irreplaceable Files On It To Just Anyone!
Hiring the wrong computer repair guy can not only be incredibly frustrating and expensive, but you could end up losing ALL of your irreplaceable files, photos, music, e-mails, and other important documents!
Read this guide and you’ll discover:
•Computer scams and rip-offs that you MUST be aware of.
•5 Costly misconceptions about computer maintenance and repair.
•Viruses, worms, spyware, and hackers: what you need to know to protect yourself.
•7 Questions you need to ask before buying any computer equipment.
•5 Critical characteristics you should demand from your computer repair technician.
•Why you need to avoid “cheap” or “bargain” computer repair shops.
•The one surefire sign that you should run – not walk – out of a computer repair shop.
The 10 best IT certifications
IT certifications boast numerous benefits. They bolster resumes, encourage higher salaries, and assist in job retention. But which IT certifications are best?
Technology professionals generate much debate over just that question. Many claim vendor-specific programs best measure a candidate’s skills, while others propose vendor-independent exams are the only worthy way of measuring real-world expertise. Still other observers believe the highest-level accreditations — Microsoft’s MCSE or new Architect Series certification, Cisco’s CCIE, etc. — are the only credentials that truly hold value.
Myself, I don’t fully subscribe to any of those mindsets. The best IT certification for you, after all, is likely to be different from that for another technology professional with different education, skills, and goals working at a different company in a different industry. For that reason, when pursuing any professional accreditation, you should give much thought and care to your education, experience, skills, goals, and desired career path.
Once a career road map is in place, selecting a potential certification path becomes much easier. And that’s where this list of the industry’s 10 best IT certifications comes into play. While this list may not include the 10 best accreditations for you, it does catalog 10 IT certifications that possess significant value for a wide range of technology professionals.
The new-generation Microsoft Certified IT Professional credential, or MCITP for short, is likely to become the next big Microsoft certification. Available for a variety of fields of expertise — including database developer, database administrator, enterprise messaging administrator, and server administrator — an MCITP validates a professional’s proven job-role capabilities. Candidates must pass several Microsoft exams that track directly to their job role before earning the new designation.
As with Microsoft’s other new-generation accreditations, the MCITP certification will retire when Microsoft suspends mainstream support for the platforms targeted within the MCITP exams. By matching the new certification to popular job roles, as has been done to some extent with CompTIA’s Server+ (server administrator), Project+ (project manager), and A+ (desktop support) certifications, Microsoft has created a new certification that’s certain to prove timely, relevant, and valuable.
The new-generation Microsoft Certified Technology Specialist (MCTS) helps IT staff validate skills in installing, maintaining, and troubleshooting a specific Microsoft technology. The MCTS certifications are designed to communicate the skills and expertise a holder possesses on a specific platform.
For example, candidates won’t earn an MCTS on SQL Server 2008. Instead, they’ll earn an MCTS covering SQL Server business intelligence (MCTS: SQL Server 2008 Business Intelligence), database creation (MCTS: SQL Server 2008, Database Development), or SQL Server administration (MCTS: SQL Server 2008, Implementation and Maintenance).
These new certifications require passing multiple, tightly targeted exams that focus on specific responsibilities on specific platforms. MCTS designations will expire when Microsoft suspends mainstream support for the corresponding platform. These changes, as with other new-generation Microsoft certifications, add value to the accreditation.
Security continues to be a critical topic. That’s not going to change. In fact, its importance is only going to grow. One of the quickest ways to lose shareholder value, client confidence, and sales is to suffer a data breach. And no self-respecting technology professional wants to be responsible for such a breach.
CompTIA’s Security+ accreditation provides a respected, vendor-neutral foundation for industry staff (with at least two years of experience) seeking to demonstrate proficiency with security fundamentals. While the Security+ accreditation consists of just a single exam, it could be argued that any IT employee charged with managing client data or other sensitive information should, at a minimum, possess this accreditation. Ensuring staff are properly educated as to systems security, network infrastructure, access control, auditing, and organizational security principles is simply too important to take for granted.
There’s more to information technology than just administration, support, and networking. Someone must create and maintain the applications and programs that power organizations. That’s where the new-generation Microsoft Certified Professional Developer (MCPD) credential comes into play.
The MCPD accreditation measures a developer’s ability to build and maintain software solutions using Visual Studio 2008 and Microsoft .NET Framework 3.5. Split into three certification paths (Windows Developer 3.5, ASP.NET Developer 3.5, and Enterprise Applications Developer 3.5), the credential targets IT professionals tasked with designing, optimizing, and operating those Microsoft technologies to fulfill business needs.
A redesigned certification aimed at better measuring real-world skills and expertise, the MCPD will prove important for developers and programmers. In addition to requiring candidates to pass several exams, Microsoft will retire the MCPD certification when it suspends mainstream support for the corresponding platform. The change is designed to ensure the MCPD certification remains relevant, which is certain to further increase its value.
The Cisco Certified Internetwork Expert (CCIE) accreditation captures most of the networking company’s certification glory. But the Cisco Certified Network Associate (CCNA) might prove more realistic within many organizations.
In a world in which Microsoft and Linux administrators are also often expected to be networking experts, many companies don’t have the budgets necessary to train (or employ) a CCIE. But even small and midsize corporations can benefit from having their technology professionals earn basic proficiency administering Cisco equipment, as demonstrated by earning a CCNA accreditation.
As smaller companies become increasingly dependent upon remote access technologies, basic Cisco systems skills are bound to become more important. Although many smaller organizations will never have the complexity or workload necessary to keep a CCIE busy, Cisco’s CCNA is a strong accreditation for technology professionals with a few years’ experience seeking to grow and improve their networking skills.
Technology professionals with solid hardware and support skills are becoming tougher to find. There’s not much glory in digging elbow-deep into a desktop box or troubleshooting Windows boot errors. But those skills are essential to keeping companies running.
Adding CompTIA’s A+ certification to a resume tells hiring managers and department heads that you have proven support expertise. Whether an organization requires desktop installation, problem diagnosis, preventive maintenance, or computer or network error troubleshooting, many organizations have found A+-certified technicians to be more productive than their noncertified counterparts.
Changes to the A+ certification, which requires passing multiple exams, are aimed at keeping the popular credential relevant. Basic prerequisite requirements are now followed by testing that covers specific fields of expertise (such as IT, remote support, or depot technician). The accreditation is aimed at those working in desktop support, on help desks, and in the field, and while many of these staffers are new to the industry, the importance of an A+ certification should not be overlooked.
Some accreditations gain value by targeting specific skills and expertise. The Project Management Professional (PMP) certification is a great example.
The Project Management Institute (PMI), a nonprofit organization that serves as a leading membership association for project management practitioners, maintains the PMP exam. The certification measures a candidate’s project management expertise by validating skills and knowledge required to plan, execute, budget, and lead a technology project. Eligible candidates must have five years of project management experience or three years of project management experience and 35 hours of related education.
As organizations battle tough economic conditions, having proven project scheduling, budgeting, and management skills will only grow in importance. The PMI’s PMP credential is a perfect conduit for demonstrating that expertise on a resume.
Even years after their introduction, Microsoft Certified Systems Engineer (MCSE) and Microsoft Certified Systems Administrator (MCSA) credentials remain valuable. But it’s important to avoid interpreting these accreditations as meaning the holders are all-knowing gurus, as that’s usually untrue.
In my mind, the MCSE and MCSA hold value because they demonstrate the holder’s capacity to complete a long and comprehensive education, training, and certification program requiring intensive study. Further, these certifications validate a wide range of relevant expertise (from client and server administration to security issues) on specific, widely used platforms.
Also important is the fact that these certifications tend to indicate holders have been working within the technology field for a long time. There’s no substitute for actual hands-on experience. Many MCSEs and MCSAs hold their certifications on Windows 2000 or Windows Server 2003 platforms, meaning they’ve been working within the industry for many years. While these certifications will be replaced by Microsoft’s new-generation credentials, they remain an important measure of foundational skills on Windows platforms.
As mentioned with the Security+ accreditation earlier, security is only going to grow in importance. Whatever an organization’s mission, product, or service, security is paramount.
(ISC)², which administers the Certified Information Systems Security Professional (CISSP) accreditation, has done well building a respected, vendor-neutral security certification. Designed for industry pros with at least five years of full-time experience, and accredited by the American National Standards Institute (ANSI), the CISSP is internationally recognized for validating a candidate’s expertise with operations and network and physical security, as well as their ability to manage risk and understand legal compliance responsibilities and other security-related elements.
While pursuing my first Microsoft certification 10 years ago, I remember debating the importance of Linux with several telecommunications technicians. They mocked the investment I was making in learning Microsoft technologies. These techs were confident Linux was going to displace Windows.
Well, didn’t happen. Linux continues to make inroads, though. The open source alternative is an important platform. Those professionals who have Linux expertise and want to formalize that skill set will do well adding CompTIA’s Linux+ certification to their resumes.
The vendor-neutral exam, which validates basic Linux client and server skills, is designed for professionals with at least six to 12 months of hands-on Linux experience. In addition to being vendor-neutral, the exam is also distribution neutral (meaning the skills it covers work well whether a candidate is administering Red Hat, SUSE, or Ubuntu systems).
Let the debate begin
Technology professionals almost always have strong reactions when debating certification’s value. Listing the top 10 certifications leaves room, of course, for only 10 credentials. That means many favorite and popular designations, such as HIPAA and Sarbanes-Oxley (SOX) certifications, have been necessarily omitted. Other important accreditations, including those for VoIP providers and from PC manufacturers, Red Hat, and even Apple, have also been left out here.
By: Erik Eckel, president of two privately held technology consulting companies.
5 security threats to watch in 2010
Everyday Internet users will be a key target for cybercriminals looking to get people to download their malware, while the proliferation of social sites such as Facebook and Twitter will lead to an increase of possible fraud cases, reported Symantec.
At a media briefing Wednesday, the security vendor released a report outlining security threats enterprises and consumers should be mindful of in 2010. Of these, the security risk faced by everyday Internet users is likely to increase as criminals look to trick people into downloading malware through means such as an innocent-looking URL link or videos and pictures from unknown sources.
“[Users] could be opening themselves up to identity theft and other types of cybercrime,” Symantec said in the report, adding that the number of attempted attacks using social engineering “is sure to increase” next year.
Also, as the popularity of Apple products continues to grow, users of Macs and iPhones (two of Apple’s most popular products) should look to protect the content they place on their devices, as “more attackers will devote time to create malware to exploit these devices”, according to the report. With the increased use of smartphones, mobile security will also be an area of concern, added Symantec.
Patch time for FreeBSD users as Zero-Day exploit is published
If you are a FreeBSD user then it’s patch time as a new exploit is published which gives attackers root access to machines.
The flaw affects versions 8.0 and 7.1 of FreeBSD.
The researcher, Kingcope, has posted an explanation of the flaw on the Full Disclosure mailing list:
The bug resides in the Run-Time Link-Editor (rtld). Normally rtld does not allow dangerous environment variables like LD_PRELOAD to be set when executing setugid binaries like “ping” or “su”. With a rather simple technique rtld can be tricked into accepting LD variables even on setugid binaries. See the attached exploit for details.
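The protection Kingcope describes bypassing can be illustrated with a sketch of the kind of environment scrubbing a run-time linker is supposed to perform before loading a set-uid binary. This is a simplified Python illustration of the concept only; the real check lives in FreeBSD’s rtld C code, and the function and variable names here are invented:

```python
# Simplified illustration: a run-time linker must drop dangerous LD_*
# variables before executing a set-user-ID binary such as "ping" or "su",
# otherwise an unprivileged user could inject code that runs as root.
# Names and the variable list are illustrative, not FreeBSD's actual code.

DANGEROUS_VARS = ("LD_PRELOAD", "LD_LIBRARY_PATH", "LD_LIBMAP")

def scrub_environment(env, is_setugid):
    """Return a copy of env, minus dangerous LD_* variables when the
    binary being executed is set-uid/set-gid."""
    if not is_setugid:
        return dict(env)
    return {k: v for k, v in env.items() if k not in DANGEROUS_VARS}

# For a set-uid binary, LD_PRELOAD must be dropped:
env = {"PATH": "/usr/bin", "LD_PRELOAD": "/tmp/evil.so"}
print(scrub_environment(env, is_setugid=True))  # LD_PRELOAD is gone
```

The flaw, per Kingcope’s description, was that rtld could be tricked into skipping this scrub even for setugid binaries.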
If that doesn’t make any sense to you (and I don’t blame you if it doesn’t), don’t worry: a patch has been published. Interestingly, however, Colin Percival, the project’s security officer, felt that because of the severity of the flaw and the fact that exploit code exists, it was necessary to post the patch as soon as possible, without even publishing a security advisory:
“A short time ago a ‘local root’ exploit was posted to the full-disclosure mailing list; as the name suggests, this allows a local user to execute arbitrary code as root. Normally it is the policy of the FreeBSD Security Team to not publicly discuss security issues until an advisory is ready, but in this case since exploit code is already widely available I want to make a patch available ASAP. Due to the short timeline, it is possible that this patch will not be the final version which is provided when an advisory is sent out; it is even possible (although highly doubtful) that this patch does not fully fix the issue or introduces new issues — in short, use at your own risk (even more than usual).”
It’s also worth pointing out that this is a local exploit, not one that an attacker can exploit remotely.
Ruby on Rails becomes latest open-source offering to run on Microsoft’s Azure cloud
For a while now, Microsoft has been courting open-source software makers to convince them of the wisdom of offering their wares on Windows. So it’s not too surprising that many of those same apps also are being moved to the Windows Azure cloud platform.
At the end of November, Microsoft architect Simon Davies blogged that he had gotten the open-source Ruby on Rails framework to run on Windows Azure, using a combination of new functionality in the November Windows Azure software development kit (SDK) and some new Solution Accelerator technology. (The fruits of Davies’ labors are available at ..rubyonrails.cloudapp.net/.)
As Davies explained: “One of these (new November SDK) features enables Worker Roles to receive network traffic from both external and internal endpoints using HTTP, HTTPS and TCP. This new feature enables many new scenarios, one of them is the ability to run existing applications that receive traffic over sockets in Windows Azure.”
There are a bunch of these Azure Solution Accelerators available for download from the Windows Azure Platform Web site. Also available for download are new SDKs for Microsoft’s recently unveiled AppFabric middleware for Java, Ruby and PHP developers.
Davies noted that Microsoft has demonstrated that a number of open-source apps, including MySQL, MediaWiki, Memcached and Tomcat, can run on Windows Azure. Microsoft also has been working on delivering PHP and Eclipse tools for Windows Azure.
Recently, CNet open-source blogger Matt Asay expressed some concern that Microsoft’s “super-friendly, super-dangerous bear hug” of open-source applications — especially in the cloud realm — could do open-source more harm than good.
Some open-source vendors — SugarCRM comes to mind — have developed their own Azure ports of their wares. But in other cases, Microsoft is the instigator, either moving the open-source applications and tools onto Azure or working with a third-party to do so.
I don’t see the same kind of potential danger that Asay does in this scenario, since what really matters is whether developers and customers are interested in using what’s hosted on Azure, rather than who “put” the apps in the cloud. Do you agree?
By: Mary Jo Foley, who has covered the tech industry for more than 20 years.
Office Web Apps access comes to Windows Mobile, iPhone, Blackberry and more (with some caveats)
For the past year, Microsoft officials have said repeatedly — without offering any specifics — that Office Web Apps will work on mobile phones. But now that the public beta of Office Web Apps is available (as of mid-November), the Redmondians are revealing, with a little prodding, a bit more about the company’s mobile Office Web Apps plans.
First, a quick review of what Microsoft has promised regarding Office 2010 on mobile phones. Company officials have been saying for months that Microsoft is planning to offer two different ways for phone users to get access to Office 2010: Via Office Web Apps (the Webified versions of Word, Excel, PowerPoint and OneNote) and via Office Mobile 2010, the phone-centric version of Office. Both Office Web Apps and Office Mobile 2010 are available as free, downloadable betas by anyone interested in trying them out. (It bears repeating that the Office Web Apps version that went to public beta in mid-November is not the free, consumer version of Office Web Apps; it’s the beta of the business version that will be available as a paid offering and requires SharePoint.)
The public beta of Office Web Apps does support mobile access, Microsoft officials said. But which phones and which browsers? Here’s the list:
IE on Windows Mobile 5/6/6.1/6.5
Safari 4 on iPhone 3G/3GS
BlackBerry 4.x and newer versions
NetFront 3.4, 3.5 and newer versions
Opera Mobile 8.65 and newer versions
Openwave 6.2, 7.0 and newer versions
“Support,” in the case of Office Web Apps, means viewing only of Word, Excel, PowerPoint and OneNote documents. (No OneNote viewing is part of this beta, a Microsoft spokesperson reminded me on December 3.) You cannot create or edit these documents from your phone. That’s true now (as of the public beta) and will be true when the final versions of Office Web Apps are available by June 2010, Microsoft officials confirmed to me yesterday.
Office Mobile 2010 enables editing and viewing of Word, Excel, PowerPoint, OneNote and SharePoint documents on phones running the Windows Mobile 6.5 operating system.
Microsoft is planning to provide the ability to view Word, Excel, PowerPoint and OneNote documents on phones to customers who opt for its free Office Web Apps suite, too. But so far, it is still working out the details as to how that will work, as customers of the free version access Office Web Apps via Windows Live SkyDrive and not SharePoint, a spokesperson told me. The free version of Office Web Apps is slated to launch ahead of the business versions — some time this spring, alongside the (beta or final — not sure which) of the Windows Live Wave 4 suite of services — Microsoft execs told me recently.
If you’re wondering why you need SharePoint to view Office documents from your phone, here’s what the spokesperson said:
“You don’t need the SharePoint Workspace on your mobile phone since you are accessing the docs through the browser, but you do need SharePoint on the back end. Technically speaking, the ‘doc library’ needs to detect what kind of browser it’s talking to, and then send the document to be rendered in ‘mobile Office Web App viewer mode’ of the Office Web App when the end user opens the doc. SharePoint Doc Library handles this detection and properly hands off of the doc to the Office Web App/ browser for rendering on mobile phones.”
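The detect-and-hand-off logic the spokesperson describes can be sketched roughly as follows. This is a hypothetical Python illustration of browser detection, not SharePoint’s actual implementation; the marker strings and function names are invented:

```python
# Hypothetical sketch of the detection a document library performs:
# inspect the User-Agent header, then hand the document either to the
# mobile Office Web App viewer or to the full browser-based Web App.
# Marker list and return values are illustrative assumptions.

MOBILE_MARKERS = ("Windows CE", "iPhone", "BlackBerry", "NetFront", "Opera Mobi")

def choose_viewer(user_agent):
    """Decide which rendering mode the doc library should hand off to."""
    if any(marker in user_agent for marker in MOBILE_MARKERS):
        return "mobile-web-app-viewer"
    return "full-web-app"

print(choose_viewer("Mozilla/5.0 (iPhone; CPU iPhone OS 3_0)"))         # mobile-web-app-viewer
print(choose_viewer("Mozilla/4.0 (compatible; MSIE 8.0; Windows NT)"))  # full-web-app
```

The point of the quote is that this detection runs server-side in the doc library, which is why SharePoint is needed on the back end even though nothing is installed on the phone.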
By: Mary Jo Foley, who has covered the tech industry for more than 20 years.
Netbooks dead? Not when sales are up 264 percent
Can the best-selling category of the PC market really be just a fad? A junky joke? A stunt to prop up the PC market created by Intel?
Jason Hiner at TechRepublic seems to think so. He proclaims:
Netbooks — those underpowered mini laptops with 7-inch screens and unusable little keyboards — are a dying fad. However, the legacy of the netbook will be that inexpensive notebook computers are here to stay, and they are lighter and thinner than ever.
Analysts and pundits will continue to use the term “netbook” but I’m going to argue that the device that we originally called the netbook is being phased out — and thankfully so.
I have a netbook. It’s small—9 inches—and it now belongs to my daughter. My hands are too big. The screen is too cramped. And I’m inclined to think that Jason’s right. The netbook is just a passing fancy.
And then I follow the numbers. Look at all the people buying netbooks. NPD’s DisplaySearch reckons that netbook sales surged 264 percent in the second quarter from a year ago. Revenue for the overall notebook market declined. Here’s the scorecard.
Meanwhile, check out Jason’s talkbacks. It’s a love affair, and they can’t all have been sent by the netbook fan club.
The special thing about it that makes me happy is that it’s small and so handy. I don’t need to play games or do lots of complicated things on the street. But this one is just 100% what I need and I will never give it up.
I bought a Dell Mini 9 in 2008 and have never regretted it. It’s small enough to carry in my purse, boots up quick, and maybe it’s because I have small fingers, but the size of the keyboard has never been an issue.
That said, it is not my main PC, nor would I ever try to make it such. I bought it to browse the internet and do some light word processing - the heaviest lifting I have ever asked it to do is stream movies across my wireless home network - and it has always performed flawlessly.
I bought mine due to travel restrictions imposed by the airlines on a trip to Australia in 2008 and love it. I use a regular laptop/notebook as my main computer at home but it is too big and heavy to travel with. The Netbook allows me to use almost all my programs, some engineering, spreadsheets, topographic maps and GPS routings. I even use it at home with my wireless network, sometimes in bed at night while reading books on exploring Utah so I can see the topographic maps and the satellite pictures of the area. No it doesn’t replace the desktop notebook but darn near.
Are these people bonkers? Nope. Intel’s financial results—partially fueled by the Atom chip that powers these little devices—tell the tale.
Netbooks aren’t for me, but apparently there are a ton of allegedly confused consumers still buying them. Dell and Microsoft have downplayed the netbook to some degree, but what else are they going to do? After all, the netbook is a margin killer.
So what’s the future of the netbook? It’s way too predictable to envision lightweight notebooks replacing the netbooks. Netbook 2.0, 3.0 and 4.0 are likely to have different form factors. Perhaps the Droid and the iPhone are really your netbooks. Perhaps Apple redefines the netbook category with a tablet. Perhaps people keep buying the current versions of netbooks. Netbooks will hang around and probably thrive because people like second and third computing devices. The form factor may change, but the market niche isn’t going anywhere.
Microsoft still working on an Adobe Lightroom competitor, but with a social twist
It’s been almost two years since I first got tips about Microsoft “SmartFlow,” a product that allegedly was going to compete with Adobe’s Photoshop Lightroom post-production software for professional photographers. I had thought that incubation project might have been quietly eliminated somewhere along the way.
However, during an interview I had with Microsoft Chief Software Architect Ray Ozzie this week at Microsoft’s Professional Developers Conference, I discovered work is going forward on SmartFlow — but in a new part of the company and with a new twist.
SmartFlow is now one of the projects under the recently created Microsoft FUSE social-computing lab, Ozzie said. The 82-person Future Social Experiences (FUSE) Labs will be headed by General Manager Lili Cheng. FUSE is an amalgamation of Cheng’s Microsoft Research (MSR) Creative Systems group and two other labs that are already under Ozzie: Rich Media Labs, in Redmond, Wash., and Startup Labs, based in Cambridge, Mass.
“Cheng’s got — it wasn’t really written about a lot, but there was a project under (former Chief Technical Officer) David Vaskevitch called SmartFlow,” Ozzie told me. “The FUSE Lab is bringing together people who are really great about the communications aspect of social (networking) and the media aspects. And so I’m really excited to see some of the ideas that they have in the realm of using photos, videos, and communications kind of brought together.”
After spending quite a bit of time behind the scenes with the Windows Azure team, helping that group to coalesce, Ozzie is now dedicating more of his time to other projects at the company, especially FUSE, he said this week.
SmartFlow “was heading toward Lightroom, and then we realized from the perspective of the direction of where it was going … that there’s more excitement about what people are doing,” Ozzie elaborated. “Photography has been transformed by what people are doing with camera phones a lot more than the high-end cameras. I mean, I have my DSLR kinds of things, but I just think what everyone is doing with photos and using it in the context of the communications is a lot more interesting and video is quite untapped, I think at this point.”
Like other Microsoft Labs, such as Live Labs, Office Labs and Ad Labs, there’s no promise that any of the incubations upon which Cheng and her team members are working will necessarily result in commercialized products. Ozzie didn’t offer up more specifics or a timetable as to when SmartFlow may be available to the public in test or final form. But once the cover is raised on SmartFlow, it will be interesting to see what social networking will bring to photo editing.
(A related aside: Vaskevitch, the former Microsoft CTO with the company’s Server and Tools group, quietly left Microsoft in September, I realized only today when searching for his title for this post. Vaskevitch had been with Microsoft since 1986 and had held a variety of marketing and strategy positions at the company.)
Older PCs Costing You More Than You Think: The Financial Rewards of PC Refresh
Many IT departments have postponed their PC refresh in the belief that delaying spending is the right thing to do. But older PCs can cost more to support, make workers less productive, cause more security breaches, use more energy, and are seldom under warranty.
There are strong financial reasons to refresh PCs now and, with the Windows 7 operating system, there are even stronger reasons to do so with PCs powered by Intel® Core™ 2 processors with vPro™ technology.
Give us a call at (858) 633-1800 and we can provide you with more information on why we recommend you upgrade.
VMware Announces VMware View 4
Earlier this week VMware announced VMware View 4, the latest version of its desktop virtualization product.
The release includes enhanced PC over IP (PCoIP) display protocol technology licensed from Teradici that both vendors claim is optimized for virtual desktop environments.
Scott Davis, VMware’s chief technology officer for VMware View, explained the benefits of PCoIP in a recent blog post. "VMware has been diligently working with Teradici to create a virtualized implementation of this robust, innovative protocol and deliver the premier remote desktop experience for VMware View. PCoIP is a server-centric protocol, meaning that we are doing the majority of the graphics rendering and processing on powerful servers," wrote Davis. "Compressed bitmaps or frames are transmitted to the remote client. This division of labor has some ideal properties for static content."
"[It makes] use of the powerful processing capabilities of multi-core servers such as Intel’s Nehalem to render the graphics," he continued. "More importantly, by transmitting compressed bitmaps or frames, we can adjust the protocol in real time to account for the available bandwidth and latency of the communications channel."
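The real-time adjustment Davis describes, i.e. tuning frame compression to the bandwidth and latency actually available on the channel, can be sketched in a few lines. This is an invented illustration of the general idea; the thresholds, tier names, and function are assumptions, not the actual PCoIP algorithm:

```python
# Illustrative sketch of bandwidth/latency-adaptive frame compression,
# the general technique behind server-centric display protocols like the
# PCoIP approach described above. All thresholds and names are invented.

def pick_quality(bandwidth_kbps, latency_ms):
    """Map measured channel conditions to a compression quality tier."""
    if bandwidth_kbps > 10_000 and latency_ms < 20:
        return "lossless"   # LAN-class link: send full-quality frames
    if bandwidth_kbps > 1_000 and latency_ms < 100:
        return "high"       # good WAN link: light compression
    return "low"            # constrained link: compress aggressively

print(pick_quality(50_000, 5))   # lossless
print(pick_quality(2_000, 60))   # high
print(pick_quality(300, 250))    # low
```

Because the server does the rendering and only ships compressed frames, it can re-run a decision like this continuously as network conditions change, which is the "adjust the protocol in real time" behavior Davis is pointing at.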
VMware View 4 also introduces new features that make it easier for administrators to manage and provision virtual desktops. "With View 4, VMware is really ratcheting up the sophistication of the data center portion of desktop management," said Charles King, principal analyst for IT industry analysis firm Pund-IT. "VMware is trying to make the provisioning and management of virtual machines easier and more seamless than it has been in the past with many other competing solutions."
VMware has been locked in a battle with Microsoft and Citrix over the enterprise virtualization market. VMware is still widely seen as the leader in that fast-growing market segment, and company officials hope that the release of VMware View 4 will help the company maintain that position.
VMware View 4 will be available on November 19 and will be offered in two editions: VMware View 4 Enterprise Edition will be priced at $150 per user, and includes VMware vSphere 4, VMware View Manager 4 and VMware vCenter 4; VMware View 4 Premier Edition will cost $250 per user and includes VMware vSphere 4, VMware View Manager 4, VMware ThinApp 4, VMware vCenter 4 and VMware View Composer.
A free trial is available for download here.
TOP 4 Reasons to outsource your IT Department
VNet Professionals offers an unlimited monthly IT support contract. At no additional cost to you, your business can call us whenever you need help 24/7. In doing so, we become your dedicated IT department for your staff and business alike.
1) The high cost of an in-house IT department
The average annual salary of an in-house IT manager can be costly, not to mention the cost of recruitment, company benefits and the training required to keep skill sets up to date. This is an overwhelming burden for a small to medium business. IT downtime also frustrates the operation of any business by forcing your staff to spend valuable hours trying to solve problems they do not completely understand.
2) Lack of in-house expertise
Your IT systems are invaluable to your business, but we understand you simply can’t justify spending money on a full-time in-house employee. To accommodate your business needs, VNet Professionals Inc. offers an unlimited monthly IT support contract. At no additional cost to you, your staff can call us whenever you need help. In doing so, we become your dedicated IT department for your staff. In other words, you receive the benefit of an entire department, not just one or two people.
3) Extend your in-house expertise
Recruitment is a tough, time-consuming and expensive process, and if you’re not from a technical background, how do you recruit the right people? At VNetPros, we spend countless hours training on the latest technologies, earning certifications and researching best-of-breed solutions so you don’t have to. By outsourcing your IT needs to VNetPros, you get the peace of mind of knowing you’re getting a dedicated team of highly trained, industry-leading experts who do the job right the first time, every time. We engineer success!
4) Stay ahead of the technology curve
Research shows that keeping the responsibility for technology in-house can reduce business continuity and office productivity by up to 56%. Technology transforms at a rapid rate, and businesses that change with it stay competitive. How will your business learn about new technologies before your competitors do? Simple: by building a relationship with VNetPros, you have access to people in the trenches. We live and breathe technology, and our knowledge gives you advance business intelligence on leading technology in any industry.
ActiveBatch Job Scheduler Adds Integrated VMware Support
Advanced Systems Concepts, Inc. (ASCI), maker of ActiveBatch® workload automation and job scheduling software and other solutions that enhance Windows, Linux, UNIX, z/OS and OpenVMS systems, among others, today announced the availability of a new VMware Extension for ActiveBatch. For the first time, users and developers can easily deploy and maintain virtual systems within ActiveBatch while also integrating those systems with other applications, databases and platforms inside and/or outside their VMware environment.
ActiveBatch’s new VMware support is made possible through a new set of production-ready job steps in the application’s integrated jobs library. These job steps eliminate the need for command line use and/or expensive scripting, replacing those actions with a convenient drag-and-drop interface.
“With VMware support, ActiveBatch users can achieve a dramatic new level of resource utilization both within and outside the VMware infrastructure,” said Jim Manias, Vice President of Marketing and Sales for Advanced Systems Concepts. “By combining ActiveBatch with VMware, for example, you can create and/or power on a virtual machine, submit the workflow, and power off the virtual machine when the workflow has successfully completed, enabling the hardware and software involved to be made available for other purposes. The result is reliable, unattended execution of jobs in a fashion that improves service levels and maximizes the efficient use of computing resources.”
The ActiveBatch VMware library includes support for the complete set of VMware events as well as vCenter and vSphere. VMware events such as Power Up, Power Down or Reboot can trigger ActiveBatch plans and/or actions. The extension can also initiate jobs based on any of the event classes already supported by ActiveBatch, including File Trigger, WMI, Email, MSMQ, Web Services and many others.
ActiveBatch, one of the most widely used IT job scheduling and management applications with deployments in 36 countries, adds strategic business value by automating operations in real-time for improved IT operations throughout an organization. The software allows users to automate and centrally manage jobs and workflows across multiple and disparate operating systems, computing platforms, applications and databases. ActiveBatch V7, the application’s newest version, offers a dynamic Service Oriented Architecture (SOA) and related Web Services tools to support internal and external Web Services as part of scheduled workflows.
ActiveBatch VMware integration from Advanced Systems Concepts, a VMware Technology Alliance Partner (TAP), is separately licensed as an extension of the ActiveBatch system. For more information about ActiveBatch, including white papers and a free copy of a recent Advanced Systems Concepts, Inc.-commissioned study on the economic impact of ActiveBatch, log on to www.advsyscon.com.
Published Tuesday, November 10, 2009 6:42 PM by David Marshall
What Windows 7 could mean for Linux
I’ve had people using Windows 7 for about three months now, and everything about it so far seems to confirm my first impression that it’s a lot better than Vista: effectively reprising the consolidation and debugging Windows 98 offered over 95.
Once you get past the sheer shock of using a Microsoft OS that doesn’t fail daily, however, you start to fret about the things that aren’t there: as a Mac/Solaris user, for example, I find the absence of multi-screen capabilities and the relative inflexibility of working panes and icons extremely frustrating. Still it is usable; and that’s a long step forward - at least until you get to development work.
Then the frustrations set in: Visual Studio is very slick, but very limited. Specifically, it’s great if your application is going to use a super-computer desktop as a graphics terminal but pretty much counter-productive if you want to sidestep client-server and produce genuinely integrated multi-host applications.
So why? Well, mainly because Microsoft has been unable to transcend its own 90s focus on helping its sales force sell client-server into businesses. As a result, the whole .NET vision Microsoft promised to integrate into Longhorn and its successors, along with the promised PICK-like file system and security-conscious display frameworks, has been implemented only in marketing documentation.
Organizational dysfunction aside, I think the key technical reason for this is that getting those things done within the underlying memory and process management paradigm Windows NT and its successors inherited from VMS has proven, if not actually impossible, at least too hard for Microsoft to turn into a commercial success.
So now it wants to sell cloud computing and applications rentals but doesn’t have the OS foundation on which the development of these products has to rest - and that’s going to force Microsoft into a build or buy decision.
They’ve been trying to build a network-based, vaguely Unix-like OS for PowerPC for about six years now, with no success to speak of, so my guess is that the build exponents will eventually lose the argument, leaving Microsoft with three mutually exclusive choices:
1. Get there through a licensing deal with Apple;
2. Do it by adopting and extending OpenBSD; or
3. Do it by adopting and extending Linux.
Each approach has pluses and minuses: the Apple approach would cost the most upfront, but drop a leading competitor out of Microsoft’s desktop markets; the OpenBSD approach combines low cost with a high quality code base and a well deserved reputation for security; and the Linux approach capitalizes on the breadth and capabilities of its community while threatening IBM.
You’d think Microsoft could do the Apple deal at the drop of a phone call to Mr. Jobs - who clearly wants to be out of the traditional PC business anyway - but my guess is that the emotional barriers to rational behavior on this will prevent that phone call.
If it comes to a shootout between the OpenBSD and Linux options, I suspect Microsoft’s techies will line up favoring OpenBSD as the stronger foundation for all the neat stuff they dream of doing, while the marketing types will favor Linux - and in that company, marketing trumps technology every time.
So the bottom line for Linux may be simple: Windows 7 is probably Microsoft’s best OS yet and will therefore slow the move to Linux in the short term. But the limitations built into Microsoft’s development stack show it to be a dead end, leaving Microsoft to market magnificent visions of its unfolding future while quietly figuring out how and when to abandon that code base for something else. Because that something could very logically be Linux, it might be time for the Linux community to start paying a lot more attention to legacy interoperability with Windows.
By Paul Murphy (a pseudonym), an IT consultant specializing in Unix and related technologies.
Exchange 2010 Official Release
Microsoft officially released Exchange Server 2010 today. As an MSDN and TechNet subscriber, I could go download the code for free and install it on my in-house Windows Server 2008 R2 box. But I have no plans to download those bits or install them.
Instead, I’m planning to let someone else handle the heavy lifting for me, and I suspect I have a lot of company. The biggest objection to a complex but powerful server product like Exchange is the hassle of managing it locally. Using a third-party hosting company eliminates those hassles and adds benefits like redundant data storage and simplified administration.
For the past few years, I’ve kept all my personal and business e-mail, calendar, and contact information in an Exchange account hosted by Mailstreet, a division of Apptix. (Previously, I used unmanaged POP/SMTP servers for e-mail and stored messages, contacts, and calendar information locally in Outlook PST files.) Mailstreet’s service has been first-rate, including a recent trouble-free upgrade from Exchange 2003 to Exchange 2007. For our collaborative work on recent book projects, my co-authors and I have also been using SharePoint and Exchange 2007 as part of the Microsoft Business Productivity Online Suite, which has also been easy to use and extremely reliable.
The first third-party hosting company to cross the Exchange 2010 finish line is Intermedia, which announced availability of its hosted Exchange 2010 product (a custom-developed solution) within a few seconds of Microsoft’s announcement. So far, neither Apptix nor Microsoft’s BPOS division has announced definitive plans to make the latest version of Exchange available as a hosted offering. According to an Apptix spokesperson, being first isn’t necessarily that big of a deal:
Microsoft has not announced an official release date for the hosted version for Exchange 2010, but as a long-time member of Microsoft’s Technical Adoption Program, Apptix has been working successfully with Exchange 2010 in their lab for over a year. They will be more than ready to offer Exchange 2010 to customers once the hosted version, with the appropriate features and functionality for multitenancy incorporated, is made available - sometime next year.
Until that time, Apptix will continue to offer its proven hosted Exchange 2007 service that customers can rely on for mission-critical email communication needs. … Most end-users won’t even notice the enhancements of Exchange 2010, as the new features are primarily datacenter-centric. Microsoft’s new end-user benefits are really available in Outlook 2010, which Apptix will offer immediately to customers when it is commercially available.
I spoke last week with Intermedia’s Chief Operating Officer, Jonathan McCormick, about the company’s plans and its infrastructure. They currently boast a quarter-million users and expect the biggest source of growth in the hosted Exchange market – the “sweet spot” – to be companies with 200 to 500 users that are currently running the aging Exchange 2000 or Exchange 2003 and dread the prospect of an in-house migration.
Cost is an issue, of course, but data integrity is even more important to those business customers, McCormick told me: “They care about their data,” he said. Those Exchange repositories don’t just contain simple e-mail threads; they also include PowerPoint presentations, business contacts, and details of contracts. A server crash can paralyze the business for days, which is why Intermedia has multiple replicated platforms in data centers on opposite coasts, with rapid restore capabilities and a 100% Data Protection Guarantee.
Intermedia also touts its custom development skills, which allow it to simplify administration tasks via a custom control panel instead of the generic Microsoft-provided admin tools. One example is the ability to quickly perform a remote wipe of a stolen or compromised mobile device such as a BlackBerry.
One misconception I had when I started investigating hosted Exchange options is that they are expensive and only appropriate for large businesses. As it turns out, most hosting companies offer plans for small companies, and both Mailstreet and Intermedia have single-user plans appropriate for sole proprietors like me. Including ActiveSync support (which works with both Windows Mobile and the iPhone) and spam filtering, I pay roughly $14 a month for a 2GB mailbox. Businesses with multiple users can get significantly lower per-user pricing.
Speaking personally, the biggest advantage of the Exchange platform for me as a small business owner is its ability to work on multiple platforms. Ironically, my recent experiments with Apple products have been especially successful with Exchange. After Mailstreet migrated my hosted account to a server running Exchange 2007, I was able to connect the Snow Leopard Mail client and an iPhone to the server and begin syncing immediately. If I send or receive a message, create or edit an appointment or contact, or trim the contents of my inbox on any PC, Mac, or mobile device (including a Windows Mobile phone), those changes are reflected on any other device. I don’t have to think about synchronization, and I don’t have to worry about a local server failure causing me to lose important data. Given how well my setup is working, I’m in no hurry to migrate to Exchange 2010, but will probably take a closer hands-on look at Intermedia’s offering shortly.
Clearly, Google’s entry into the market (along with some very clever marketing and a halo effect from their search success) has made an impact on competitors for managed e-mail and apps, especially for small businesses. Their presence is undoubtedly responsible for Microsoft’s decision to slash its BPOS prices in half recently. It wouldn’t surprise me to see third-party hosting companies start cutting their prices as well.
By Ed Bott, an award-winning technology writer with more than two decades’ experience writing for mainstream media outlets and online publications.
Survey: Cloud interest triples; cost may not be the main factor
Everyone figures that companies are buying into cloud to save money. A new survey says otherwise. But why are companies adopting cloud?
A new study on cloud interest commissioned by Avanade shows a 320% increase over the past nine months in respondents reporting that they are testing or planning to implement cloud computing. Avanade claims this is the first time a survey has documented a global embrace of cloud computing in the enterprise.
The study also found that while companies are moving toward cloud computing, there is little support for cloud-only models (just five percent of respondents use only cloud computing). Rather, most companies are using a combination of cloud and internally owned systems, a hybrid approach.
Okay, the survey confirms what we’ve been seeing anecdotally. That is, there’s been a huge uptick in cloud interest. And apparently, this has been taking place during an economic downturn. But here’s where it gets interesting: Only 13% said the onset of a tougher economy helped push them toward the cloud. A majority, 58%, say economic conditions had nothing to do with it.
While Avanade didn’t seem to read anything into this, another observer, Paul Miller, found the result telling, suggesting that contrary to what everyone assumes, cloud computing decisions are not being driven by cost-cutting needs:
“Also interesting was the relatively small impact of the economic situation upon Cloud adoption, with only 13% suggesting it had ‘helped’ adoption plans and 58% reporting ‘no effect.’ In my conversations with Nick Carr and others, there’s been an underlying presumption (on my part, as well as theirs) that cost-saving arguments with respect to Cloud Computing would prove persuasive and compelling. It would appear not. This would suggest, of course, that enterprise adopters are taking to the Cloud for reasons other than the budget sheet…”
If it isn’t low entry costs, then why is cloud computing so popular? Avanade says half of the companies surveyed that have migrated to cloud computing technologies use it to “manage and deliver business applications such as customer relationship management (CRM) and human resources (HR) services.” Forty-six percent of respondents are also using cloud computing for data storage.
On the question of flexibility and agility, the Avanade survey suggests that the service model is taking over as the prevailing IT value proposition. As Avanade puts it, the “online services model is beginning to fundamentally change how IT services are consumed and provisioned in large organizations. More than half of respondents report that they are currently using Software as a Service applications. In the United States, that number increases to more than two-thirds (68 percent).”
And many of these deployments are internal clouds. Avanade says that globally, “there is a 2:1 ratio of respondents who prefer SaaS delivered internally (or as private services) versus from third-party service providers. There is an even greater disparity in the United States, with a 4:1 ratio in favor of internal SaaS deployments.”
By Joe McKendrick, an author and consultant with deep knowledge and insights regarding trends and developments in the technology industry.
Should you upgrade to Windows 7? Download the Upgrade Advisor
Wondering whether you should upgrade to Windows 7?
Find out if upgrading to Windows 7 would be beneficial for your business by contacting VNet Professionals today.
One of our qualified Sales Engineers will give you advice for your business and a free estimate.
But first, you can run a free diagnostic to find out if your system is ready by downloading the Upgrade Advisor. The Windows 7 Upgrade Advisor scans your PC’s system, programs and devices to check whether it can run Windows 7. The report will tell you whether your PC meets the system requirements, whether there are any known compatibility issues with your programs and devices, and what your upgrade options to Windows 7 are.
Netbooks worth the upgrade to Windows 7? Heck yes!
I haven’t been a hugely vocal early adopter as far as the Windows 7 hype goes. As with many Microsoft upgrades, I’m happy to let them trickle in via new hardware after extensive testing with our critical systems. Fortunately, an extremely stable release candidate has been kicking around for quite a while and I’m not too fussed about how 7 will function with the average set of applications.
Regular readers will know that I’m also not too fussed about 7 in general, since so many good alternatives are available. As I wrote the day that 7 launched, “Windows 7 is just another OS.” Windows XP is still chugging along, most of us have figured out how to get Vista to behave nicely, OS X is slick and stable, and Ubuntu 9.10 is just around the corner. Competition is our friend, right?
There’s one place, though, where an upgrade is definitely in order and it should probably happen sooner than later: netbooks. Most netbooks ship with Windows XP Home Edition, which performs relatively well on an Atom processor, but which also lacks any sort of reasonable enterprise features. Lots of us have deployed netbooks in schools, whether through traditional vendors or in Classmate incarnations, and have simply lived with the shortcomings of XP Home (no domain join, security vulnerabilities, no management capabilities via Active Directory, lack of granular user security, etc.). Others have bitten the licensing bullet and upgraded their netbook deployments to XP Professional already.
For those who haven’t, though, and are still using XP Home, the $70 volume academic license upgrade is worth the cost of admission. $70 a pop can add up quickly, but in deployments wedded to a Windows ecosystem, the advent of a netbook-capable, solid, secure OS in Windows 7 Professional is a welcome upgrade.
Intel has noted in the past that Classmates and most of their software ecosystem run well under 7, but it has not yet released plans to formally support 7 on Classmates. However, for the many non-Classmate deployments out there, assuming that Ubuntu isn’t an option for any number of reasons, schools should seriously consider leaving XP Home behind and taking advantage of the full complement of enterprise and security features in Windows 7. New tools from Microsoft should make this process relatively painless on computers without optical drives. Other migrations? They can wait for the first service pack.
By Christopher Dawson, the technology director for the Athol-Royalston School District in northern Massachusetts and a member of the Internet Press Guild.
Microsoft chops prices of its hosted enterprise cloud offerings
Microsoft is cutting prices of its Microsoft-hosted Exchange, as well as its suite of business services (known as the Business Productivity Online Suite, or BPOS), and is refunding the difference to existing hosting customers.
Microsoft is cutting its Exchange Online pricing from $10 per user per month to $5 per user per month. It also is cutting the price of the BPOS bundle — which includes SharePoint Online, Exchange Online, Communications Online and Live Meeting — from $15 per user per month, to $10 per user per month.
Microsoft is leaving the pricing for its Deskless Worker versions of its hosted Online offerings the same. Exchange Online Deskless Worker and SharePoint Online Deskless Worker remain $2 per user per month. The bundle of the two Deskless Worker offerings stays at $3 per user per month.
Not surprisingly, Microsoft officials didn’t attribute the price cut to competition from Google Apps or other hosted offerings. Instead, they attributed the cuts to “rapid customer adoption, global scale and improved efficiencies from new software such as Exchange Server 2010” (according to the press release).
Microsoft is making BPOS available in 15 new countries before the end of the year. Later this week, BPOS will be commercially available in Singapore; trials are slated to begin in Brazil, Chile, Colombia, Czech Republic, Greece, Hong Kong, Hungary, Israel, Malaysia, Mexico, Poland, Puerto Rico, Romania and Taiwan. Commercial availability in India is also expected later this year, officials said.
Microsoft officials are now claiming to have more than 1 million paying users for Microsoft’s Online family of services (not counting Live Meeting, for which there are many more paying customers, according to company officials). Newly signed BPOS customers include Hofstra University, Lions Gate Entertainment, McDonald’s Corporation, Rexel Group, Swedish Red Cross and Tyco Flow Control.
Microsoft will be adding a paid, Microsoft-hosted version of Office Web Apps – the Webified versions of Word, Excel, PowerPoint and OneNote – to its Online stable next year. Company officials have said that paid offering will also be available to Microsoft volume-license customers so that they can host Office Web Apps themselves, on-premises, instead of or in addition to allowing Microsoft to host it for them. There will be additional (and, as yet, still unannounced) features in the paid Office Web Apps offering that aren’t part of the free, ad-funded version.
Microsoft is currently rolling out refreshes to its Online family of services every 90 days or so, according to Ron Markezich, Corporate Vice President of Microsoft Online. Some of the new features the company is rolling out to its on-premises software — such as Exchange 2010 — are debuting in the hosted, Online offerings before they are available to customers as server-based products. (The final Exchange 2010 software bits are slated to go to customers starting next week.)
I’m sure Microsoft customers will be upbeat about the price cuts for Microsoft’s hosted offerings. But I’d think Redmond’s partners who are trying to make money from selling Microsoft’s hosted services (if not their own hosted version of Microsoft’s wares) might be less enthusiastic.
By Mary Jo Foley, who has covered the tech industry for more than 20 years.
The 10 biggest failures in IT history
I recently shared a list of events I believe were pivotal in shaping today’s IT industry - things like the development of COBOL and the creation of UNIX. This time around, I’ve listed a few of the biggest failures in IT - but I’ve tried to steer clear of the same ol’ items everyone has on their lists.
Note: This article is also available as a PDF download.
1: Windows Vista
What a disaster! Could Microsoft have assembled a bigger failure if it tried? Well, possibly. But Microsoft wasn’t trying to make a failure — it was trying to make the best of the best. The result was the worst of the best.
2: NeXT
I have to qualify this entry, because NeXT did inspire a lot of software for the Linux desktop (such as AfterStep), and NeXTSTEP did eventually become the foundation of OS X. So NeXT wasn’t a complete flop.
3: BeOS
What is it with the capiTalIzaTion? Although BeOS has been resurrected as Haiku, the BeOS (and all the cool hardware it promised) never really got off the ground. The PC that promised to be the dream machine for the media crowd fizzled out before its fuse could really be lit.
4: Cobalt Qube
The Cobalt Qube looked cool. If you’re lucky, you can still find one on eBay going cheap. Underneath that tiny blue exterior lay a beefy 64 MB of RAM and an 8.4 Gig HD that was ready and willing to serve up your Web site, your mail, your DNS, or anything else you needed. Ah, but those were the glory days — and short-lived at that. The serious IT crowd quickly realized that function held sway over form, and the cool blue Qubes went nowhere. Even after Sun bought the Cobalt company, these devices did nothing.
5: Y2K
I can’t resist including this one. The entire world was supposed to cave under the pressure this little bug promised, wasn’t it? I even read plenty of sci-fi books based on that premise. But nothing happened. Banks didn’t lose all of your money, the world’s security didn’t fall to pieces, and all IT professionals woke up the next morning collectively saying, “Was that it?”
6: MP3
I know, I know — it isn’t a flop, exactly, but the MP3 format is on this list because of all the licensing issues it has caused. On the Linux operating system alone, MP3 isn’t installed on most distributions, by default, because of licensing issues. As a result, users scramble to get MP3 support built into their various tools. This causes as much hair loss as MP3 causes audio quality loss. There are much better formats out there without the licensing issues, people!
7: Richard Stallman
This man was supposed to be the champion of open source — but he endangers open source at every turn. Instead of making ridiculous claims, RMS should stand down and let someone with a modicum of tact and sense take over as the voice of open source software.
8: WordPerfect
What I should actually place here is Corel, the maker of WordPerfect, instead of the software itself. WordPerfect was an outstanding word processing tool. Corel, however, was not outstanding in its ability to market and sell something as good as WordPerfect. So instead of a piece of software that should have single-handedly toppled the Microsoft juggernaut, WordPerfect died. This should never have happened. Any other company could have pulled off this win.
9: IPv6
Should this already be in place? Should something so simple really be that hard? The ‘net could run out of IP addresses and there is no solution in place yet. Why? Because we don’t have the problem yet. But didn’t everyone panic with claims that the “IP sky is falling”? Wouldn’t it be smart to go ahead and put this in place? Maybe the powers-that-be are waiting until that very last IPv4 address is issued and we have to say, “We have no more!” At that point, no one will really know how to implement the solution and it will be Y2K all over again.
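To put some numbers behind the exhaustion argument, Python’s standard `ipaddress` module can state the two address spaces directly. This is a quick illustration added here, not part of the original article:

```python
# IPv4 offers 2**32 addresses; IPv6 offers 2**128.  The stdlib
# ipaddress module (Python 3.3+) confirms both figures.
import ipaddress

ipv4_total = 2 ** 32     # about 4.3 billion addresses
ipv6_total = 2 ** 128    # about 3.4e38 addresses

assert ipaddress.ip_network("0.0.0.0/0").num_addresses == ipv4_total
assert ipaddress.ip_network("::/0").num_addresses == ipv6_total

# IPv6 enlarges the pool by a factor of 2**96, roughly 7.9e28.
print(ipv6_total // ipv4_total)
```

Four billion addresses sounded inexhaustible in 1981; the gap between the two figures is why the transition was seen as inevitable even then.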
10: Mesh networks
At one point, wireless was going to cover the entire planet and everyone was going to have free wireless networking, thanks to wireless mesh networks. It didn’t happen. It sounded like a great idea, and sites popped up all over the place trying to get users to set up their own mesh networks to further expand the “net.” It was a grand idea, based on a grand ideal, but it just never got off the ground. That’s a shame, since a “mesh Wifi” would have enabled anyone to be online anywhere. Of course, I am sure the telecoms had NOTHING to do with the fall of mesh networking.
Microsoft does a 180 on Exchange 2007 support (in a good way)
After notifying customers and partners that Exchange Server 2007 wouldn’t be able to run on Windows Server 2008 R2, the latest version of Windows Server, the Exchange team has reversed its decision.
In a posting on the Exchange Team Blog, Microsoft officials said they’d heard the negative feedback loud and clear. Customers didn’t want to be forced to move to Exchange 2010 before they were ready just to be able to run a version of Exchange on Windows Server 2008 R2.
To fix the problem, Microsoft is prepping an update that will be out some time next year. From a November 4 blog posting by Kevin Allison, General Manager of Exchange Customer Experience:
“In the coming calendar year we will issue an update for Exchange 2007 enabling full support of Windows Server 2008 R2. We heard from many customers that this was important for streamlining their operations and reducing administrative challenges, so we have changed course and will add R2 support. We are still working through the specifics and will let you know once we have more to share on the timing of this update.”
One Exchange 2010 caveat that seemingly hasn’t changed: Users who want to run Exchange 2007 and Exchange 2010 together must upgrade to Exchange 2007 Service Pack (SP) 2.
Microsoft is slated to provide customers with the final Exchange 2010 bits starting the week of November 9. Microsoft released Exchange 2010 to manufacturing in early October.
By Mary Jo Foley, who has covered the tech industry for more than 20 years.
10 Mistakes that Rookie IT Consultants Make
IT consulting is a difficult, complex industry. I’ve seen numerous competitors enter the market, only to fail. Everyone from large electronics chains (does anyone remember CompUSA’s business consulting effort or Circuit City’s Firedog initiative?) to local independents has come and gone. Despite frighteningly large marketing budgets (including symposium sponsorships, television commercials, and print advertising), complex marketing strategies, splashy fleet vehicles, and eerie team-building propaganda, competitors often fail within just months.
And there’s a reason. IT consulting is a dynamic, ever-changing industry that requires practitioners to maintain multiple skills. Rapid technological shifts frequently change the way you work, the tools you use, and the operational procedures you require. To meet that challenge and stay in the game, you must learn early on how to avoid some of the more preventable pitfalls. Here are 10 mistakes that consultants often make when they’re starting out.
Note: This article is also available as a PDF download.
1: Underestimating total project time
None of us is perfect. Unforeseen issues always arise. There are no “simple” projects. Consultants must take those issues into account when preparing project cost estimates.
The very first time I ever estimated a simple Windows Small Business Server rollout for a client with seven employees in two locations, I budgeted eight hours to “deploy the server.” In developing my estimate, I included time to unbox and install the server, set up DNS, configure the VPN, join the second location to the VPN, register the domain name, configure MX records, create data shares, set permissions, and configure and test email accounts. Let’s just say it took longer.
New consultants must be particularly careful to review project plans before settling on a final estimate that is forwarded to the client. Such estimates should be first run by veteran IT staff for feedback whenever possible.
2: Failing to properly document project scope
Why did my first server deployment take longer? In my conversations with the client about the project, I focused on the tasks associated with deploying the server. The client already had a peer-to-peer network in place. I saw my role as simply dropping the server on the network, joining workstations to the domain, configuring a VPN to give a remote but key employee data access, and introducing email.
But the client thought a “server deployment” included installing a couple network printers with network scanning functionality, upgrading Microsoft Office software on eight workstations, implementing site-wide antivirus, and other tasks. Such disconnects are the IT consultant’s fault.
Clients are not technology experts. It is the consultant’s responsibility to ensure that the client’s business needs and objectives are understood and that the technology deployed matches them. Whenever estimating a project now, I provide clients with a project plan that lists specific bullet points. I don’t just state “deploy server,” “configure DNS,” etc., as most clients don’t know what that even means. Instead, before starting a project, I go through a project plan with the client that reviews the tasks I will perform and the specific functionality those tasks will provide (“Users will store their files on the server’s X drive,” “All users will send/receive email using Microsoft Outlook 2007 on their desktop workstations,” “A new network printer will enable scanning documents and storing them over the network to a Z drive hosted on the new server,” etc.).
3: Underestimating hardware costs
Just as it’s easy to underestimate the time and labor required to properly complete a project, hardware costs frequently become a source of trouble. Here’s one common scenario: An IT consultant specifies a particular gigabit switch or router when assembling a project budget while a vendor is offering promotional pricing (and the temporary price cut may NOT be evident when researching prices). Or a server configuration may be priced using unique components. Ten days may pass before the client approves the purchase. Then, when the consultant proceeds to order the items, the server configuration or the promotional pricing (or both!) are no longer available.
I see it all the time, even with one leading Texas-based computer vendor’s promise of 30-day price locks. And I’ve yet to see one of these changes work in the consultant’s favor. Whenever preparing project estimates, always note that hardware costs are subject to change. Be sure, too, to always include shipping costs in estimates. Clients should find no surprises when receiving a final invoice, but if the consultant neglects to include shipping costs in preliminary conversations, such fees will prove problematic.
4: Trying to master all technologies
An IT consultant cannot master all the technologies clients require. It’s not going to happen. Some busy consultants will service three or four clients a day. There’s no way that consultant is going to develop comprehensive expertise with all the myriad applications clients wield, such as Dentrix (dental), Timberline (accounting), QuickBooks (financial management), Intergy (physician practice), Act (database), Prolog (project management), Aloha (restaurant), and SEMCI Partner (insurance), as well as routing platforms (Cisco, SonicWALL, WatchGuard, etc.), Windows desktop and server operating systems, antivirus solutions, Exchange email, and others.
Determine which platforms you’ll master. Then make sure you know who to call for assistance when troubleshooting problems with the remainder. Whether you’re contacting the software manufacturer or another consultant to assist when servicing a platform with which you don’t have expertise, you’re performing a service for the client. Ultimately, clients typically don’t care that you know every nuance of every program — they just want a dependable partner they can call when they encounter technology issues.
5: Waiting to send invoices
Consultants, especially those starting a new business, are particularly eager to jump on new projects. It’s seemingly best to always be billing. Given the choice between taking downtime to develop and mail invoices or going onsite to complete another service call, rookie consultants almost always favor knocking out additional service calls. But there’s no cash flow when invoices aren’t going out.
New consultants must schedule time, daily whenever possible, to write and distribute invoices. A CPA client gave me great advice. He recommended I always send invoices within a day of completing work. He told me studies reveal customer satisfaction is highest when invoices are received quickly.
It makes sense. Every day a consultant delays sending an invoice, clients forget a little more of the pressing need that demanded the repair or service. When bills arrive three weeks or a month later, cash flow not only suffers, but customers are more likely to believe charges are excessive. This is because the business and operations interruptions, and the resulting trauma and downtime the consultant corrected, have been forgotten.
6: Scheduling too many calls
When planning a typical workday, consultants should schedule one or two hours of time for every hour billed. Essentially, that means two to four service calls are the most that can be reasonably accommodated on any given day. A fair rule of thumb is that each member of an IT consultancy traveling onsite to resolve client issues should bill 20 to 25 hours per week. Any more than that, and you begin stretching resources too thin.
When scheduling client calls (I aim for four billable hours per day, which I have consistently met for years), you must include time for administrative and operational work. Numerous tasks require a consultant’s attention, including managing payroll, accounting, QuickBooks data entry, internal IT, advertising, and marketing tasks.
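As a rough illustration of the capacity math above, here is a minimal sketch. The one-to-two-hours-of-overhead-per-billed-hour rule comes from the text; the eight-hour workday and one-hour service call are my assumptions for illustration:

```python
# Sketch of the scheduling rule of thumb: every billed hour carries
# one to two hours of unbilled overhead (travel, admin, follow-up).
# The 8-hour day and 1-hour call length are illustrative assumptions.

def max_daily_calls(workday_hours, overhead_per_billed_hour, hours_per_call=1.0):
    """Estimate how many service calls fit in a workday."""
    cost_per_call = hours_per_call * (1 + overhead_per_billed_hour)
    return int(workday_hours // cost_per_call)

print(max_daily_calls(8, 1))  # 4 calls at best (1 hour of overhead per billed hour)
print(max_daily_calls(8, 2))  # only 2 calls (2 hours of overhead per billed hour)
```

The output matches the article’s claim that two to four service calls are the most that can reasonably fit in a day.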
7: Failing to market the business
Rookie consultants, whether working for a firm they own or as an employee within a consultancy, typically strive to maximize billable hours. The desire for billable hours sometimes comes at the expense of obtaining new clients and chasing larger projects. These consultants should do more than just report to work and service existing clients. They must take time to attend BNI, chamber, Rotary, and other networking meetings. They should distribute business cards at every opportunity.
Some consultants don’t believe they have time for additional marketing responsibilities. That’s a common mistake. The fact is, many business networking events end before 8:00 AM, so there’s no excuse for new consultants not to rise early and attend networking events before their regular work day begins. Recently, a longtime friend and insurance agent reminded me that, by scheduling 7:00 AM and 7:30 AM meetings every day, he’s opened an additional 250 meetings a year on his calendar. That’s impressive.
8: Overlooking travel costs
Many consultants, especially those new to consulting, don’t realize the costs of travel time. Traffic is expensive. Very.
Consider the facts. If an IT consultant charges $115 an hour for onsite commercial work, and traveling to client sites consumes just six hours a week (it’s likely much more), the opportunity cost of traffic and travel time to the consultant exceeds $30,000 annually.
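The arithmetic behind that figure is easy to check; a minimal sketch using the numbers from the text:

```python
# Back-of-the-envelope version of the travel-cost figure above.
hourly_rate = 115          # $/hour for onsite commercial work
travel_hours_per_week = 6  # the article's conservative estimate
weeks_per_year = 52

annual_opportunity_cost = hourly_rate * travel_hours_per_week * weeks_per_year
print(f"${annual_opportunity_cost:,}")  # $35,880 -- "exceeds $30,000 annually"
```

Six unbilled travel hours a week is nearly a full billable day lost, which is why the surcharges described below exist.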
Those costs must be captured. Typically, IT consultancies capture them in the form of onsite service fees, inflated first-half-hour rates, or other surcharges. Just this past week, a plumber completed work at my residence. The bill included a $35 “truck fee.” That’s nothing but fair. In addition to paying for fuel and wear-and-tear on a fleet vehicle, the plumbing shop needs to cover the time spent traveling to my home.
New IT consultants must remember to charge 30% to 40% more than their regular onsite rate for the first half-hour or simply add a flat-rate callout fee.
9: Charging too little
There’s a natural temptation, especially among new technology consultants, to believe the rates they charge are too expensive. But running a business costs money, lots of it, and technology solutions are complex. Consultants must remember that their expertise, and the delivery of onsite service especially, possesses great value. Hourly onsite support rates vary from $85 to $125 or more per hour. But that doesn’t mean a new consultant must charge just $85 per hour.
To the contrary. Local market conditions are usually the largest factor. The cost of delivering services is higher in Boston, where taxes, fees, parking, and other expenses are naturally higher than in Louisville, KY, where the cost of living is lower. Thus, an IT consultant in Boston should expect to earn a higher hourly rate than a consultant in Louisville.
10: Working Saturdays
Technology consultants operate within a pressure-packed environment. This is likely the single greatest factor I underestimated when opening my own consulting shop almost four years ago.
Most clients don’t call for help before critical systems fail. Instead, they wait. Then they try to fix it themselves. Next, they enlist the assistance of the local computer geek on staff. Often, the consultant is called only after these efforts — and those of the business owners’ friends, colleagues, and neighbors — have failed to resolve the problem. As a result, IT consultants spend much of their time running from raging and complicated fires to blisteringly complex crises. It is fatiguing work. Many days, my technicians and I are physically and mentally exhausted by 2:00 PM.
Inevitably, clients request that consultants work weekends. I almost always say no. It’s not that I so feverishly guard my personal time. Instead, as I mature and spend more time within the industry, I’ve come to understand the importance of approaching complicated issues with a fresh mind and properly fed body (of which I’m not making light; too often my staff and I must skip lunch because of new-client crises). How many times have you struggled with a complicated Windows issue at 1:00 AM, only to quickly solve it the next morning after getting some sleep and a decent breakfast?
The same principle is true within a consulting firm. Rookie consultants must take time to help their bodies, physically and mentally, recover from the rigors of their profession. That means minimizing weekend work, for better or for worse.
By Erik Eckel, president of two privately held technology consulting companies. He previously served as executive editor at TechRepublic.
10 ways to evaluate your IT management software
Choosing the right IT management software solution is a strategic decision that requires the careful planning and consideration of every IT department.
Many factors can influence your ultimate choice. High functionality is a must, but what about cost? Is the software portable and flexible? Is it user-friendly? Are there customization options for your specific needs? How long will it take to implement? It can sometimes be difficult to distinguish irrelevant factors from truly crucial ones. These considerations can be painstaking – after all, every IT management team dreads the risk of choosing the wrong software solution. Here are some essential criteria that industry experts have gathered to help you determine whether or not your software is getting the job done.
- 1) Quick installation and implementation
Installing your IT management software should require minimal time and resource investment. After installation, implementation should be a quick and seamless transition into immediate use. The processes of deployment, customization, and setting configurations should take no longer than a few hours. If a program is so cumbersome that it takes months to effectively implement, you’ll have wasted valuable time and energy that could have been channeled towards meaningful productivity.
- 2) Simple interface that facilitates ease-of-use
All too often, your purchase of IT management software is immediately followed by a new challenge: how does the thing even work? If your software does not provide a user-friendly interface, you’ll have to invest more time and resources into learning the program’s basics. Even worse, you might miss out on important functionalities simply because you can’t figure out how they work – or can’t find them in the program to begin with. Your program should offer an interface that is so intuitive that even your less technically-inclined end-users can use it without needing a user manual.
- 3) Modular and integrated structure for centralized access to information
Effective IT management encompasses a wide range of components: helpdesk, asset management, monitoring, reports, projects, tasks, problem management, change management, and CMDB. Combining multiple tools can be incredibly complex, and your IT management software solution must not consist of “stand-alone” modules that isolate data into silos of information. Your software must provide a comprehensive and completely integrated range of tools. This unified approach provides centralized access to information and facilitates your success.
- 4) Low maintenance and upgrade costs
When evaluating the cost-efficiency of your IT management software, the price of your initial purchase is only a small component of your long-term investment. Many tend to overlook the pressing issue of hidden costs: how much is maintaining your software going to cost you? Maintenance fees can be surprisingly expensive, and if the program is complicated, you may need to hire private consultants every time you want to upgrade your system. As the effectiveness of your software depends on continued maintenance, you should have to pay no more than a small percentage of your initial purchase on an annual basis for continued maintenance.
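To see why those recurring fees matter, here is an illustrative sketch. The $10,000 purchase price and 20% annual maintenance rate are assumptions for illustration, not figures from any vendor:

```python
# Illustrative only: why annual maintenance fees dominate long-term cost
# even when each year's fee looks like a "small percentage" of the
# purchase price. Numbers are assumed, not vendor figures.

def total_cost(initial, annual_maintenance_rate, years):
    """Initial purchase plus a flat annual maintenance fee over a period."""
    return initial * (1 + annual_maintenance_rate * years)

# A $10,000 purchase at 20% annual maintenance doubles the outlay in 5 years:
print(total_cost(10_000, 0.20, 5))  # 20000.0
```

This is exactly why the initial purchase price is only a small component of the long-term investment.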
- 5) Intuitive customization options
Customization extends well beyond implementation into the ongoing maintenance of your system. Choose an IT management software solution that provides you with as many customization options as possible, and these customization options must be simple and straightforward. You shouldn’t have to hire a team of consultants every time you want to make a customization. The point of customization is for the product to cater to your needs, and if the process takes longer than a few minutes to do on your own, it’s too complicated. Remember, you shouldn’t have to work for your software – it should work for you!
- 6) Long-term support approach
In the months and years after you purchase, your software provider should demonstrate an unwavering commitment to your satisfaction. An excellent service experience means that every issue you report is taken seriously and quickly addressed. Personal attention should be more than lip-service – it should truly be a core tenet of your product provider’s support mission. Your purchase of an IT management software solution is largely motivated by the desire to improve your IT service quality; why shouldn’t you expect the same quality of service from the company you purchase it from?
- 7) IT professional community
A true enhancement to any IT management software is access to an active online community of other IT professionals. Whether it is learning how to perform a remote reboot or how to implement ITIL best practices in your organization, online community forums give you indispensable tools for better IT management. Your participation can help you get the most out of your software’s capabilities. An IT community also lets you compare yourself to other IT professionals – see what they’re up to, what they’re concerned about, and where you stand. Staying in-the-know on the latest trends is crucial in the ever-dynamic world of IT.
- 8) Strong references
A reliable indicator of your software package’s merit is what other customers have to say about it. When evaluating your IT management solution, ask for references and testimonials. Don’t hesitate to ask for references per industry or geography if this is meaningful to your evaluation. Feedback from other IT professionals is a trustworthy reference, and it can help you reach a conclusion about what the product can do for you. If they love it, ask why. A company that boasts an especially loyal customer base is doing something right – especially when those customers are IT pros, who don’t settle for anything less than excellence.
- 9) Commitment to feedback
As an IT professional, you know what tools and features you need to maximize your IT performance. The company that provides your IT management software should listen to your input and strive to implement your ideas in future releases. Your constructive feedback can – and should – help forge the future of the product, and you must feel that your ideas are valued. More than just demonstrating a commitment to user satisfaction, a company that incorporates your feedback into its product guarantees to deliver the kind of performance you need.
- 10) Strong vision and dynamic future
IT is a world of ever-evolving innovation and technological development. When evaluating your IT management software solution, you need to ask: is this company receptive to change and improvement? What is its past, and where is it going? How many new versions does it release a year? How many features does every new version introduce? Are these features significant additions to the product’s capabilities, or are they just minor patches and bug fixes? On the support page of your product’s website, check out its release history to evaluate its direction into the future.
You shouldn’t have to settle for anything less than a superior IT management software solution. Believe it or not, you don’t have to.
Saar Bitner is the Sales & Marketing Director of SysAid Technologies Ltd, an Israeli IT software company that provides IT Helpdesk solutions to better manage IT infrastructure with greater ease and efficiency. The company has deployed its software at more than 55,000 organizations in 120 countries. For more information about SysAid Technologies, please visit www.sysaid.com.
Which antivirus is best at removing malware?
Detecting the presence of malicious code is one thing; successfully eradicating it is entirely another.
According to AV-Comparatives.org’s recently released malware removal test evaluating the effectiveness of sixteen antivirus solutions, only a few met the criteria of not only removing the FakeAV, Vundo, Rustock and ZBot (Zeus) samples they were tested against, but also getting rid of the potentially dangerous “leftovers” from the infection.
More info on the tested antivirus solutions, and how they scored:
The test covered the following antivirus solutions: Avast Professional Edition 4.8; AVG Anti-Virus 8.5; AVIRA AntiVir Premium 9.0; BitDefender Anti-Virus 2010; eScan Anti-Virus 10.0; ESET NOD32 Antivirus 4.0; F-Secure AntiVirus 2010; G DATA AntiVirus 2010; Kaspersky Anti-Virus 2010; Kingsoft AntiVirus 9; McAfee VirusScan Plus 2009; Microsoft Security Essentials 1.0; Norman Antivirus & Anti-Spyware 7.10; Sophos Anti-Virus 7.6; Symantec Norton Anti-Virus 2010; and Trustport Antivirus 2009. It relied on a modest malware sample set, but one whose prevalence is easily seen in the wild these days.
“None of the products performed “very good” in malware removal or removal of leftovers, based on those 10 samples. eScan, Symantec and Microsoft (MSE) were the only products to be good in removal of malware AND removal of leftovers. Due to the sample size, the final ratings may be generous, but we applied the scoring tables strictly. We tried to give different values for different types of leftovers, although this was very difficult in some gray area cases.
This was the first public malware removal test of AV-Comparatives and, due to the lack of generally accepted ways to rate malware removal abilities, we did our best to give a fair rating based on the observed overall malware removal results and to not look at / base our ratings on e.g. the deletion of the binary malware only.”
It’s worth keeping in mind that the timeliness of these comparative reviews in an ever-changing threat-scape should be considered before jumping to any conclusions. For instance, quality-assurance-aware cybercriminals rely on underground alternatives to the popular VirusTotal service, allowing them to pre-scan their malware releases before including them in a campaign.
The bottom line: prevention is always better than the cure. In terms of malware, that means running an up-to-date operating system that is also free of third-party application and browser plug-in vulnerabilities, maintaining decent situational awareness of cybercriminals’ current tactics, and understanding that antivirus software is only one part of a defense-in-depth solution.
By Dancho Danchev, an independent security consultant and cyber threats analyst with extensive experience in open source intelligence gathering, malware and cybercrime incident response.
The real pros and cons of server virtualization
First, let’s be clear: this comment is about server virtualization through ghosting - the business of using one OS to run one or more ghost OSes, each of which in turn is able to run one or more applications - it’s not about desktops, not about N1 type technologies, and not about containerization.
The pre-eminent examples of ghosting OSes are IBM’s zVM - an OS that originated in the late 1960s as one answer to the memory management and application isolation problems confronting the industry at the time - and VMware’s more recent rendition of the same ideas for x86.
Back then, IBM was caught between rocks and hard places: lots of people (including IBM’s own research leaders) were developing system resident interactive OSes aimed at using the computer largely as a central information switch, but its commercial customer base absolutely refused to countenance any advance on the batch tabulation and reporting model around which its management ideas had evolved in the 1920s and 30s.
Thus when the Multics design effort started at MIT in 1959/60, most of IBM’s people didn’t even know there were two sides to the argument, but the research people lined up with science-based computing while those who made the money for IBM almost unanimously chose the data processing side - and ten years later, after MIT’s people had first won their design battles and then lost the war (by letting data processing get control of the Multics development effort), IBM’s own fence-sitting solution, VM, ended up roundly hated by nearly everyone.
Nearly everyone, that is, except people limited to IBM 360 class hardware who had no other means of achieving any kind of interactive use (this was before MTS and a dozen later solutions) - and they, essentially over the objections of IBM’s own management, made VM the success it still is.
All of which brings us to the 90s, when available x86 hardware mostly wouldn’t run NT 3.51 and Microsoft’s emergency iVMS port, aka 4.0, contained a misconstrued uaf derivative known as the registry that effectively limited it to loading one application at a time - thus forcing buyers to choose between a lot of downtime or rackmounts of dedicated little boxes.
The rackmounts won - at least for a few years; but then data processing took control of the wintel world and VM, in the VMware incarnation of its ideas, soon became the preferred tool for reducing the rackmount count in the name of their professional holy grail: higher system utilization.
Unfortunately there are two big problems with this:
- first, NT 4’s limitations went away with NT 4 - addressing them today with VMs achieves a level of absurdity no audience would accept in musical comedy - it’s right up there with using a licensed terminal emulation on a licensed PC to access a licensed server running a licensed PC emulation;
- second, it is very nearly a universal truth that every gain data processing makes in improving system utilization produces a larger loss in IT productivity for the business paying them to do it.
The reductio ad absurdum example of the latter is Linux running under VM on a zSeries machine: data processing can get very close to 100% system utilization with this approach, but the cost per unit of application work done will be on the order of twenty times what it would be running the same application directly on Lintel; and every variable in the user value equation: from response time to the freedom to innovate, gains a negative exponent.
You can see the latter consequence in virtually every result on benchmarks featuring some kind of interaction processing. For example, the Sun/Oracle people behind Sun’s recent foray into TPC/C demonstrated both their own utter incompetence as IT professionals (by achieving less than 50% CPU utilization) and the user value of this “failure” (by turning in response times averaging roughly one seventeenth of IBM’s):
| IBM p595: Avg response time in seconds at 6,085,166 tpmC | Sun T5440: Avg response time in seconds at 7,717,510.6 tpmC |
(Values from the detailed reports at http://www.tpc.org/tpcc/results/tpcc_perf_results.asp)
The counter argument I usually hear about all this is that virtual system images are more easily managed than real ones - and this is both perfectly true and utterly specious.
It’s perfectly true that VM style virtualization lets you bundle an application with everything it needs to run except hardware, and then move that bundle between machines at the click of an icon; but the simple fact that this applies just as well to Solaris containers as it does to VM ghosts shows that this is an argument for encapsulation and application isolation, not for ghosting.
Worse, the argument is completely specious because it bases its value claims on two demonstrably false beliefs: first that the only alternative is the traditional isolated machine structure, and second that virtualization lets the business achieve more for less. Both are utter nonsense: Unix process management has worked better than VM since the 1970s, and because virtualization adds both overheads and licensing it always costs more to do less than modern alternatives like containerization or simply letting the Unix process management technology do its job.
Again the quintessential example of this is from the heart of the data processing profession: when you take a $20 million zSeries installation and achieve a 60-way split to produce 100% system utilization from 60 logical machines running applications or ghosts, what the business gets out of it is roughly equivalent to what it would get from four Lintel racks costing a cumulative $500,000.
A more down home illustration is provided by VMware itself - their competitive value calculator computes a cost advantage for their products over those from others on the basis of their belief that their VMs impose less overhead and allow you to get closer to 100% hardware utilization. Thus if you enter values saying you’ve got 200 applications running on NAS connected quad core servers, want to manage virtually, and have average infrastructure costs, they produce a table with this data:
|                                    | VMware vSphere 4: Enterprise Plus Edition | Microsoft Hyper-V R2 + System Center |
| Number of applications virtualized | 202 | 205 (inc. mgmt VMs) |
| Number of VMs per host             | 18  | 12 |
| Number of hosts                    | 12  | 18 |
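As a sanity check on the table above, the host counts follow from the stated per-host VM densities by simple ceiling division (a sketch; I’m assuming the calculator rounds up to whole hosts):

```python
import math

# Reproduce the host counts in VMware's calculator output from the
# stated consolidation ratios. Assumption: partial hosts round up.

def hosts_needed(num_vms, vms_per_host):
    """Whole hosts required to run num_vms at a given VM density."""
    return math.ceil(num_vms / vms_per_host)

print(hosts_needed(202, 18))  # 12 hosts (VMware's claimed density)
print(hosts_needed(205, 12))  # 18 hosts (Hyper-V, per the calculator)
```

The six-host difference rests entirely on the claimed 18-vs-12 VMs-per-host density, which is exactly the overhead claim questioned below.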
All of which should raise a couple of questions in your mind:
- first, if the consensus that ghosting doesn’t have significant overhead is right, where is VMware getting the third of the box it claims you can recover by getting its ghosting software instead of Microsoft’s?
- and, second, wouldn’t the money VMware wants you to spend on ghosting ($241K in this example) be better spent on hiring people who can move these applications to free environments like Linux or OpenSolaris?
So what’s the bottom line? Simple: the real ghost in ghosting is that of 1920s data processing - and the right way to see this particular con job for the professional cost sink it is, is to focus on costs to the business, not ideological comfort in IT.
By Paul Murphy (a pseudonym), an IT consultant specializing in Unix and related technologies.
Software vulnerability assessments and patch management are essential to the overall security of business information and data. Find out how this free trial download of an industry-leading vulnerability scanner can assess and prioritize your vulnerabilities by criticality and deliver actionable information through an intuitive user interface, where users can easily create a variety of PDF-based reports about vulnerabilities in your OS, applications, policies, and security configurations.
Clean install with Windows 7 upgrade media? Get the facts!
Last week I complained about Microsoft’s shoddy documentation of how its upgrade procedures are supposed to work. I’m delighted to report that I got a tremendous and immediate response from within Microsoft, offering assistance in my testing and also promising to clean up and expand their documentation. I spent most of the weekend working on a table that I’ll publish later this week. I’m also testing various upgrade scenarios to see which ones work and which require a workaround.
Meanwhile, an argument that should have died ages ago has reared its head again. If you purchase a discounted upgrade edition of Windows 7, can you use it to perform a clean installation of the operating system on a PC that doesn’t currently have Windows installed?
The answer is really simple. If you qualify for an upgrade license, then yes, you can use any number of workarounds to install the operating system legally. If you don’t qualify for an upgrade license, then those same workarounds might technically succeed, but your license is not valid. Will you get away with it? Probably. But if you’re running a business, you run the risk that an employee will turn you in to the Business Software Alliance, which could lead to an audit, civil charges, and eventually some stiff penalties.
Let me see if I can help uncomplicate things.
The overwhelming majority of PCs are sold with Windows preinstalled by an Original Equipment Manufacturer (OEM). The rules are in the license agreement that you see when you first turn on that PC. You can find any license agreement for Windows (retail or OEM) at the Microsoft Software License Terms page. If you read the retail and OEM license agreements, you will see that there is absolutely no requirement to install the software in a specific way. Here, for example, are the details from the OEM license agreement for Windows Vista Home Basic/Home Premium/Ultimate. I have used bold type to emphasize key terms.
Section 2: “The software license is permanently assigned to the device with which you acquired the software. That device is the ‘licensed device.’ A hardware partition is considered to be a separate device.”
[In Windows 7, the language is slightly clearer: “The software license is permanently assigned to the computer with which the software is distributed. That computer is the ‘licensed computer.’”]
Section 13: “To use upgrade software, you must first be licensed for the software that is eligible for the upgrade.” [This identical language appears in Section 14 of the Windows 7 license.]
Section 14: “Proof of License: If you acquired the software on a device, or on a disc or other media, a genuine Microsoft Certificate of Authenticity label with a genuine copy of the software identifies licensed software. To be valid, this label must be affixed to the device or appear on the manufacturer’s or installer’s packaging. If you receive the label separately, it is invalid. You should keep the label on the device or the packaging that has the label on it to prove that you are licensed to use the software. If the device comes with more than one genuine Certificate of Authenticity label, you may use each version of the software identified on those labels.” [This text appears in the Windows 7 license in Section 16, with the word “device” replaced by the word “computer.”]
That sticker on the PC is the proof of your original full license, the one that qualifies you for the discounted upgrade to a new version. There is NO requirement in the license agreement or elsewhere that the qualifying software be installed first for the upgrade to be valid.
Finally, there’s the question of what older Windows versions qualify for an upgrade to Windows 7. The answer is on the retail upgrade box: “All editions of Windows XP and Windows Vista qualify you to upgrade. … If you are upgrading from Windows XP, you will need to back up your files and settings, perform a clean install and then re-install your existing files, settings, and programs.”
Here’s a picture. Note that it specifically says “clean install,” not “custom install.”
So what are the rules? Let’s break it down by some specific situations:
You originally purchased a PC with a copy of Windows XP or Windows Vista. You qualify for an upgrade on that specific PC. Any version of XP or Vista qualifies for an upgrade to any version of Windows 7. So if you bought a Dell in 2007 with Windows XP Home preinstalled, you can buy a retail upgrade of Windows 7 Professional and install it on that PC. This is true even if along the way you wiped the hard disk clean and installed a beta of Windows 7. The license for Windows XP was permanently assigned to that machine when you first turned it on and accepted the license agreement. The fact that the original operating system isn’t currently installed on the PC is irrelevant.
You just bought a brand-new Mac and you want to use Boot Camp to install Windows 7 on it. You do not qualify for an upgrade license. Apple didn’t sell you a copy of Windows with your Mac, so there is no original Windows edition to qualify for an upgrade license. From a contractual point of view, you must purchase a full license to install in the Boot Camp partition.
You installed virtualization software on your PC or Mac and you want to run Windows 7 in a virtual machine. You do not qualify for an upgrade license. A virtual machine is considered a separate PC. In fact, Section 3(d) of the Windows 7 Professional license agreement makes this explicit: “Use with Virtualization Technologies. Instead of using the software directly on the licensed computer, you may install and use the software within only one virtual (or otherwise emulated) hardware system on the licensed computer.” Because there is no previously licensed version of Windows XP or Vista in your newly created virtual machine, you do not qualify for an upgrade. The exception, of course, is Windows XP Mode in Windows 7 Professional and higher.
You have Windows XP or Windows Vista on your current PC and you want to use Windows 7 on a separate partition as a dual-boot machine. You do not qualify for an upgrade license. Refer back to the previous wording in Section 2 about a hardware partition being a separate device. The Windows 7 license agreement covers this in Section 14: “Upon upgrade, this agreement takes the place of the agreement for the software you upgraded from. After you upgrade, you may no longer use the software you upgraded from.”
You built your own PC from parts and you want to install Windows 7 on it. You do not qualify for an upgrade license. You need a full retail license. (You can also use a System Builder OEM license, but that’s a separate issue I’ll cover later.)
As I said earlier, this stuff is irrelevant to most people. If you buy a new PC with Windows on it from a legit dealer, you don’t have to think twice about licensing. If you buy a retail upgrade and install it on your system that’s currently running XP or Vista, you also have no hassles except those associated with upgrading.
The real people this information applies to are two groups: PC experts who support other PC users or want ultimate control over their own PCs, and people trying to get a bargain. For either group, it pays to understand the rules.
I’ll have several follow-up posts on the whole messy licensing issue. Stay tuned.
Ed Bott is an award-winning technology writer with more than two decades’ experience writing for mainstream media outlets and online publications.
Seven perfectly legal ways to get Windows 7 cheap…or even free
If you’ve read any reviews of Windows 7, you’ve seen references to its price list, which ranges from $120 for a Home Premium upgrade to $320 for a fully licensed copy of Windows 7 Ultimate.
Well, guess what? You don’t have to pay that much. Most people have much better options available, if you know where to look. As I’ve detailed here, the best deals go to PC manufacturers, which you benefit from if you buy a new PC.
But there are plenty of other discounts available as well. In this post, I’ve researched deals in three separate categories: upgrade offers available to anyone, special deals just for students, and subscriptions intended for technical professionals and developers.
Most of the details I include here apply to Windows customers in the United States, but some offers are also available in other countries. Where possible, I have tried to track down those details and include the names of countries where equivalent offers exist. If you live outside the U.S., follow these links to find prices and terms for your country.
My goal in this post is to point you to deals that customers legitimately qualify for. I am not trying to encourage attempts by anyone to get away with something you’re not entitled to. If there are restrictions for a specific offer, I’ve noted them here.
[Update 6-Nov 1:00PM PST: Several people in the comments have asked why I didn’t include the Microsoft Action Pack in this post. Two reasons: First, it is available only to bona fide system builders, and that’s a fairly small group of people. Second, and more importantly, the licenses it includes expire and must be decommissioned if you fail to renew your MAP agreement each year. Every other example I have here includes Windows licenses that are good in perpetuity. I will cover System Builder pricing and licensing in more detail next week. Stay tuned.]
Ready to get started? Pick a category and go.
Page 2: Upgrade offers. You can save as much as 58% off the regular cost of a Windows 7 upgrade if you know how to buy smart. I’ve found three options.
Page 3: Special deals for students. If you’re enrolled in a college or university, even taking a single course at your local community college, you can get Windows 7 Home Premium or Professional for $30. Students in technical or design majors can get Windows 7 (and many other Microsoft programs) for free if their university or college is signed up for the right programs.
Page 4: Windows (and much more) by subscription. Are you an IT pro, a Windows enthusiast, or a professional developer? For a surprisingly low annual fee, you can get access to a staggering amount of Microsoft software, including every version of Windows or Office. There are some restrictions, so be sure to read the details carefully.
Up to 55% off: Windows Anytime Upgrade
Who’s eligible: Anyone running Windows 7 Starter, Home Basic, Home Premium, or Professional
If you custom-build a new PC, you can choose the exact Windows 7 edition you want on it. OEMs get the best pricing, so this is usually your best option. But if you purchase a preconfigured PC from an online or local retailer, you get whatever edition of Windows they chose to install on it, typically Windows 7 Home Premium for consumer PCs. Outside of the U.S., Western Europe, and other developed markets, you might get Home Basic, and on a netbook you can get the wimpy Starter edition.
Purchasing a full retail upgrade is one option, but the Anytime Upgrade option can be much cheaper. For instance, a retail upgrade of Windows 7 Professional costs $199.99. If you have a PC with Windows 7 Home Premium already installed on it, you can buy the Anytime Upgrade option for $89.95 direct from Microsoft. Likewise, you can go from Windows 7 Home Premium to Ultimate for $139.95, which is a considerable savings over the $219.99 retail upgrade price for Ultimate. (The full price list is here at the Microsoft Store.) Online retailers like Newegg.com offer the same deal for a discount of a few bucks, although you have to wait for a physical box to be shipped.
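The headline “up to 55% off” is easy to verify from the prices quoted above. Here’s a quick back-of-the-envelope sketch (prices as listed at the time of writing; they may change):

```python
# Compare Anytime Upgrade pricing against full retail upgrade pricing,
# using the prices quoted in this post (starting edition: Home Premium).
retail = {"Professional": 199.99, "Ultimate": 219.99}
anytime = {"Professional": 89.95, "Ultimate": 139.95}

for edition in retail:
    saved = retail[edition] - anytime[edition]
    pct = 100 * saved / retail[edition]
    print(f"{edition}: save ${saved:.2f} ({pct:.0f}% off retail)")
```

The Professional step-up works out to roughly 55% off retail; the Ultimate step-up is closer to 36%, which is why the offer is advertised as “up to” 55%.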
Up to 58% Off: Windows 7 Home Premium Upgrade Family Pack
Expires: “Limited time offer” with no specific expiration date
Who’s eligible: Any multi-PC household (international)
If you have two or more PCs in your home and you want to upgrade them to Windows 7, this deal is for you. This package is only available in a physical box and (according to Microsoft) only for a limited time. It includes two DVDs: one copy each of the 32-bit and 64-bit Windows 7 Home Premium upgrade installation media. You get a single product key that can be activated on up to three different PCs.
In the United States, I found the Family Pack at the Microsoft Store for $150, but you should be able to pick it up elsewhere for a discount of at least $10. Even if you only use two of the licenses and thus pay an average of $75 apiece, this is a big savings over two single upgrade copies at $120 each. If you use all three upgrades, the cost per machine is $50 or less.
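The per-machine math above is worth spelling out. A minimal sketch, using the $150 Microsoft Store price and the $120 single-upgrade price quoted earlier (retailer discounts will shift these numbers a bit):

```python
# Per-license cost of the three-activation Family Pack vs. buying
# single Home Premium retail upgrades, at the prices quoted above.
family_pack = 150.00     # Microsoft Store price, up to three PCs
single_upgrade = 120.00  # one Home Premium retail upgrade

for pcs in (2, 3):
    per_pc = family_pack / pcs
    savings = pcs * single_upgrade - family_pack
    print(f"{pcs} PCs: ${per_pc:.2f} each, ${savings:.2f} total savings")
```

Even at two activations you come out $90 ahead; at three, the pack saves $210 over buying single upgrades.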
According to Microsoft, this offer is also available in Japan, Canada, Germany, the UK, France, the Netherlands, Switzerland, Austria, Ireland, Luxembourg, and Sweden.
The license says you can install Family Pack upgrades on up to three PCs in the same household, for use by residents of that household. When I asked Microsoft whether it was OK to use this license in a home business, I was told, officially, “There is no restriction around use of a license for business purposes conducted within the home,” although naturally they recommended Windows 7 Professional for those situations.
Nothing in the license prevents you from mixing and matching the 32-bit and 64-bit versions on up to three PCs in your household. But no, you can’t share licenses with your neighbor or your cousin in Peoria.
Up to 50% Off: Buy a new PC, upgrade your old PC for half off
Expires: January 2, 2010
Who’s eligible: Anyone who buys a new PC with Windows 7 from a participating retailer
Microsoft has publicized this deal on its website, but retailers seem a little shy about promoting it. When you buy a new desktop PC or laptop with Windows 7 included, you can buy a second upgrade copy of Windows 7 for use with another PC at a discount. The estimated price for a copy of Windows 7 Home Premium is $49.99, Windows 7 Professional is $99.99, and Windows 7 Ultimate is $119.99.
According to Microsoft, the following merchants in the United States are participating: Fry’s, Newegg.com, Staples, Office Depot, Costco, Best Buy, Radio Shack, Amazon, Tiger Direct, Walmart, Buy.com, and The Microsoft Store.
If you go to Newegg, you’ll find the offer available as a Combo Deal with individual PCs. So, for example, if you buy a Toshiba Qosmio X505-Q830 you can pick up a second boxed retail upgrade of Windows 7 for $70-100 off. I didn’t see any mention of the offer in this week’s local ad for Best Buy. Maybe a salesman would offer me this deal if I shopped at a local store.
Amazon.com offered the deal on this page, but I didn’t get any clue or pointer to this offer when I added a new PC to my shopping cart, and the promotional discount wasn’t applied to my order until I was ready to check out.
If you’re planning to buy a new PC anyway, this deal is worth it, but you might have to be persistent to get it.
Up to 85% off: The Windows 7 Academic Offer
Expires: January 3, 2010
Who’s eligible: College/university students (international)
If you are an eligible university student who attends an educational institution in the United States, you can purchase an upgrade edition of Windows 7 Home Premium or Windows 7 Professional for $29.99. (That’s a huge savings from the regular price of $119.99 or $199.99, respectively.) You must be “actively enrolled in at least 0.5 course credit.” Full terms for the U.S. offer are here. Any college or university that gives you a .edu address qualifies, as do the eligible institutions on this list. If you don’t have a qualifying e-mail address, you can still apply by following these instructions. To apply in the United States, start here.
According to Microsoft, similar offers are also available in Japan, Canada, Germany, the UK, France, the Netherlands, Switzerland, Austria, Ireland, Luxembourg, and Sweden.
Limitations? The deal is one copy per student. Digital download is fulfilled through Digital River, or you can pay $13 extra for a physical disk. The offer is non-transferable, but the terms are curiously vague about whether you can sell or give away the software itself. This is not an academic or otherwise restricted license; it is the same upgrade package available via retail outlets.
Free: MSDN Academic Alliance
Expires: No expiration date
Who’s eligible: College/university students in technical departments (international)
If you are enrolled in a science, technology, engineering, or math department at an educational institution that belongs to the MSDN Academic Alliance, you can get free software for use in your studies. (There are also similar offers for students in visual, illustration, design, and art departments.) The program also extends to members of IEEE and ACM. The list of available titles originally included Windows 7 Professional, but when word spread of this benefit, both organizations suddenly had a flood of new membership requests, virtually all of them from non-students looking for a freebie. That inspired this announcement from Microsoft’s Academic Care blog.
The release of Windows 7 through these subscriptions triggered an unanticipated situation that put the program at risk: We saw signs that non-students were joining ACM and IEEE as student members solely to obtain Windows 7 through MSDN AA. This infringed on the intent of the program and the conditions of the MSDN AA license. As a result, we decided to remove Windows 7 from the association MSDN AA memberships while we evaluate approaches to ensure that the offering is reaching only the target audience: students and educators. While we expect to have a final position on the matter resolved in the near future, we cannot guarantee that Windows 7 will be available through these associations due to the complexity of student enrollment verification.
So, here’s the bottom line: If you want to join IEEE or ACM, you won’t get a free copy of Windows 7. But if you’re a student in a technical or design course of studies, you might qualify and you should aggressively pursue your right to this benefit. You can find out whether your school is eligible by searching here. If you’re an English or Political Science major or a non-student, you should look elsewhere.
Annual subscription: TechNet Plus
Expires: No expiration date
Who’s eligible: Anyone (international)
If you’re an IT pro, technical professional, journalist, or hobbyist, Microsoft has a program called TechNet Plus designed to give you access to a wide range of evaluation software for a single annual subscription fee. The price varies by country, and also by whether you’re purchasing as an individual or on behalf of an organization. In the United States, the price is $349 for the first year and $249 annually for renewals. (Both of those prices are for download-only access; if you want DVDs shipped to you, you’ll need to pay a higher price.)
What you get for that price is access to a staggering amount of software, including just about every version of Windows (desktop and server) ever made, along with past and current editions of Microsoft Office, developer tools, servers, and much more. You get multiple activations for most products – typically 10 product keys for every Windows and Office edition. You also get access to premium Microsoft support: two complimentary incidents per year.
The software and accompanying product keys don’t expire. So if you decide next year not to renew your subscription, you can continue to use the software and keys you downloaded.
So what’s the catch? Read the license agreement carefully! This software is NOT for use as a replacement for licenses on PCs you use at home or work. Here’s what the FAQ says:
The license grants installation and use rights to one user only, for evaluation purposes, on any of the user’s devices, this may include devices at home. Keep in mind that you may use the evaluation software only to evaluate it. You may not use it in a live operating environment, a staging environment, or with data that has not been sufficiently backed up. You may not use the evaluation software for software development or in an application development environment.
For technical professionals who evaluate hardware and software professionally, or for hobbyists who want to play around with new technologies, this is a tremendous deal.
Annual subscription: Microsoft Developer Network (MSDN)
Expires: No expiration date
Who’s eligible: Anyone (international)
The terms and benefits of an MSDN subscription are generally similar to those offered to TechNet subscribers, with a few crucial differences. The biggest difference is that MSDN is specifically intended for professional software developers. An annual subscription gives you access to a wide range of professional developer tools and pre-release products.
Every MSDN subscription includes access to the latest version of Windows with multiple activations. You can choose from different levels of MSDN subscriptions. The cheapest is the MSDN Operating Systems subscription, which costs $699 for the first year and $499 for renewals. It offers full access to Windows, toolkits, and SDKs. Prices go up for other editions: $999 ($649 renewal) for an Expression Professional subscription, for example, which is intended for designers and web developers and includes Windows, Office, Expression Studio, and Visual Studio Standard Edition.
Unlike TechNet licenses, which are strictly for evaluation, an MSDN Premium subscription specifically permits you to install and use one copy of the latest edition of Microsoft Office (currently Office Ultimate 2007), Project, SharePoint Designer, Visio Professional, and Office Communicator “for General Business Use … on one machine for any purpose.”
The MSDN license agreement is detailed and worth reading in full. There’s an excellent summary of your rights as a subscriber here. This paragraph is especially noteworthy:
Many MSDN subscribers use a computer for mixed use—both design, development, testing, and demonstration of your programs (the use allowed under the MSDN Subscription license) and some other use. Using the software in any other way, such as for doing email, playing games, or editing a document is another use and is not covered by the MSDN Subscription license. When this happens, the underlying operating system must also be licensed normally by purchasing a regular copy of Windows such as the one that came with a new OEM PC.
If you’re a professional developer or designer who uses Microsoft products, MSDN subscriptions can be a bargain. If you just want cheap access to Windows 7, you have better options.
Ed Bott is an award-winning technology writer with more than two decades’ experience writing for mainstream media outlets and online publications.
Top 10 reasons to upgrade from Windows Vista to Windows 7
You told us you wanted your PC to be safer, more reliable, and more responsive. We designed Windows 7 to simplify the things you do every day, work the way you want, and make some exciting new things possible. Want examples? Here are 10 good reasons to make the move to Windows 7.
Free Guide: Seven Tips and Tricks For Windows 7
Seven Tips & Tricks For Windows 7—Part 1
Get on your way to becoming a Windows 7 power user with these new features and shortcuts for Microsoft’s latest operating system.
More Laptops Combine Core i7, Windows 7
Lenovo is the latest to add the Core i7 processor to its laptop lineup. The IdeaPad Y550P is one of several consumer laptops and desktops for Windows 7 that Lenovo announced yesterday. Intel first introduced Core i7 mobile processors, based on its Nehalem microarchitecture, in late September, but they remain high-end chips with list prices ranging from $364 to more than $1,000. The vast majority of laptops still use Core 2 Duo processors, or AMD Athlon or Turion chips. The arrival of Windows 7, however, has unleashed a wave of new notebooks including more Core i7 models.
The IdeaPad Y550P is an entertainment laptop with a 15.6-inch display and Nvidia discrete graphics. Lenovo hasn’t announced the final specs, but the Y550P should start around $1,149 with the 1.60GHz Core i7-720QM.
Lenovo announced two other new laptops: the U550, a thinner 15.6-inch mainstream model, and the 11.6-inch IdeaPad U150 ultraportable. Both use Core 2 Duo processors, and the U550 has switchable graphics (integrated and discrete). The U150 and U550 will start at $585 and $650, respectively. Lenovo does not offer a quad-core processor on its ThinkPad business line, with the exception of the 17-inch W700, a mobile workstation that offers Core 2 Quad processors but not Core i7.
Lenovo also unveiled three Windows 7 desktops–only two of which, the IdeaCentre B500 and K300, will be available in the U.S. The most interesting is the B500, an all-in-one with a 23-inch 1920×1080 display, Core 2 Duo E5400, 2GB of memory, Nvidia GeForce G210M graphics with 512MB, and a 320GB hard drive. Final pricing hasn’t been set, but Lenovo’s release stated the B500 should start around $649. Last week, Lenovo announced Windows 7 updates for two laptops for small and medium-size businesses, the 14-inch ThinkPad SL410 and the 15.6-inch SL510, but these are available only with Core 2 Duo processors.
HP is now offering Core i7 processors on several “Quad Edition” models including the 15.6-inch Pavilion dv6t, 17.3-inch dv7t and 18.4-inch dv8t. The lowest priced is the $999.99 dv6t with 1366×768 resolution display, 1.60GHz Core i7-720QM, 2GB of memory, Nvidia GeForce GT 230M graphics with 1GB and a 250GB hard drive. At the opposite extreme, HP’s premium Envy 15 is also equipped with a 15.6-inch display but with a higher resolution (1920×1080) and ATI Radeon HD 4830 graphics. It starts at $1,799.99 with the same Core i7-720QM processor, 6GB of memory, and a 500GB hard drive. The 1.73GHz Core i7-820QM adds $400 to the price. Unlike the Pavilion dv6t, the Envy 15 does not have an internal optical drive to cut down on the weight and thickness, though you can purchase an external DVD burner or a combo drive that can also play Blu-ray discs.
Acer does not yet offer a model with a Core i7 processor. Gateway, which is a division of Acer, announced a new EC series of laptops, including a 15.6-inch model, but all of the new models use a 1.30GHz Pentium SU4100 dual-core processor, one of Intel’s ULV chips designed for long battery life rather than high performance. That puts the $649.99 Gateway EC5409u more in direct competition with other thin 15.6-inch models such as Dell’s Inspiron 15z and Lenovo’s IdeaPad U550.
Dell has several models that now include the Core i7. The Studio 15 starts at $999.99 with a 15.6-inch display (720p), 1.60GHz Core i7-720QM, 4GB of memory, ATI Mobility Radeon HD 4570 graphics with 512MB and 250GB hard drive. It competes directly with the Pavilion dv6t. The Studio XPS 16 is a higher-end model with a 1680×945 display, ATI Radeon HD 4670 graphics with 1GB and a 500GB hard drive. It starts at $1,399 with the same Core i7 chip. Dell’s desktop replacement, the Studio 17, also has a Core i7 720QM starting at $1,099 (17.3-inch display, 4GB of memory, ATI Radeon HD 4650 with 1GB and a 250GB hard drive). Finally, the Alienware m15x, a gaming rig, is one of the few 15.6-inch laptops with the full menu of Core i7 processors including the 2.0GHz Core i7 920XM–the fastest mobile processor currently available.
Last week Apple announced an iMac refresh that included Core i7 and Core i5 processors on its 27-inch model, but the company does not yet offer a laptop with a Nehalem processor. There are rumors that a MacBook Pro refresh may be just around the corner, though perhaps with the upcoming Arrandale Core i3/Core i5 processors.
Managed services model requires commitment from all levels
The use of managed services is set to increase significantly over the next five years. However, although many organizations have developed sourcing strategies, they have lacked the discipline of managing third-party relationships as closely as they should. This needs addressing as commitment is required from all levels to make the managed services model work.
The management of a managed services agreement should not be left to chance. Organizations must understand that they need to commit time, effort, and money to looking after these contracts and this is usually undertaken by having an in-house management team.
Managed services must be part of a considered sourcing strategy.
Organizations of all sizes are today making more intelligent sourcing decisions, enabling the procurement of discrete services to be a part of an overall organizational sourcing strategy. The benefit of increased business and financial flexibility, through the reduced need to commit to a long period of involvement with a single supplier, can make the proposition attractive to SMEs and larger organizations alike, although the importance of commitment from all levels within the customer organization should not be forgotten.
Some organizations look towards outsourcing to assist them in achieving their business objectives, and some prefer to retain the delivery of IT services in-house. However, an increasing number of organizations are not only using outsourcing to complement the services they are delivering in-house, but also implementing a multi-sourced model - that is, having a number of different providers deliver various aspects of IT requirements.
Large enterprises, in particular, are bringing back the ‘mega-deal’ - these are long lists of outsourcing requirements, but within these lists are smaller sets of requirements, many of which can be individually delivered through managed services. Essentially, where once characterized by the ‘one-stop-shop’ approach to outsourcing, large and enterprise-class organizations are requesting the delivery of more discrete managed services, which overall is a more effective sourcing strategy.
However, customers need to retain control of their managed services contracts, through the provision of an in-house management team.
Marketplace for managed services still quite fragmented
Network, voice/data convergence, and security services are the highest growth markets for managed services, being driven by the need to roll out next-generation networks and secure them (a key activity for many organizations), in order to ensure that they are subsequently able to attain the many business benefits.
L.A. votes to “Go Google”; Pressure Shifts to Google and the Cloud
The Los Angeles City Council today voted unanimously to “Go Google,” approving a $7.25 million contract to outsource the city’s e-mail system to Google’s cloud and transition some 30,000 city employees to the cloud over the coming year, according to a report in the Los Angeles Times.
Clearly, this is a big deal for the city of Los Angeles. But this vote is also monumental for cloud computing as a whole, which has gained popularity and widespread interest but still relatively little adoption as companies - and municipalities, apparently - weigh the anticipated cost benefits over the unknown risks that might come with system failures or data breaches.
The stakes are also high for Google, which has stepped up its campaign for Google Apps, its cloud-based suite of offerings, by highlighting how companies who are fed up with breakdowns and costs of maintaining old legacy systems finally decided to “Go Google.”
Both Google and Microsoft had put in bids for the city’s contract and, at one point, it seemed to be a showdown between the two, representing a bigger winner-take-all battle between old school systems and 21st Century cloud systems. In a post last month, I suggested that a win for Microsoft would show that Outlook and Exchange are still big players and that a win for Google would show that the cloud is ready for prime time.
This doesn’t necessarily mean the beginning of the end for Microsoft in this space. Los Angeles is just one city on this planet - and it’s only 30,000 city employees. But Google clearly has its sights set on the enterprise for the next wave of growth, even to the point that it could overtake - or nicely complement - the advertising business.
At the Gartner IT Symposium 2009 in Orlando earlier this month, Google CEO Eric Schmidt said the largest Google Apps deployment runs about 30,000 seats and that its goal right now is to gain users for its enterprise apps. He sees the enterprise as “humongous,” a multi-billion dollar business that has real potential. By Gartner’s calculations, enterprise accounts for about 3 cents of every dollar that Google makes, leaving plenty of room for growth.
That growth could come from the countless other municipalities, agencies and companies that have been toying with the idea of a move to the cloud but have held back, waiting for someone else to jump off the cliff first.
SMBs: Save Time & Money with SaaS
In today’s difficult economy, companies of all sizes are looking for ways to cut costs and save money. But if your small or medium-sized business (SMB) is like most, the belt-tightening has been even more severe, with all but the most critical capital expenditures receiving close scrutiny.
To alleviate some of this economic stress, many SMBs today have turned to Software as a Service (SaaS) to save on their mission-critical business applications. Could SaaS be a solution to some of your company’s financial challenges?
I personally think it is a great solution for companies that constantly need to upgrade their business applications, for projects or departments where the work is seasonal, or for instances where deployment would simply be too large an investment. One of the great benefits of SaaS applications is that the service itself is included, eliminating the heavy upfront cost and the burden of integration.
New Mexico State Attorney General’s Office Goes Google
James Ferreira, CIO for the New Mexico State Attorney General’s Office, had a choice to make to support his growing organization: upgrade to a more costly enterprise license for Microsoft Exchange or find a business-grade alternative at a better price. After extensive research, Ferreira found Google Apps Premier Edition to be the perfect solution.
Join this live TechRepublic Webcast to ask James Ferreira questions about his migration from Microsoft Exchange to Google Apps Premier and learn how your organization can plan for a seamless transition to Google Apps.
Microsoft Shipping Schedule
The shipment schedule for your software benefits is changing:
Over the next twelve months Microsoft will be releasing one of the largest waves of innovation into the marketplace in its history, starting with Windows® 7. We want to ensure that you gain access to these products as soon as possible. For this reason, we’re adjusting the shipment schedule of your software benefits to better align with the major product launches. You can then more quickly take advantage of the new development, sales, and services opportunities that these products provide.
When the shipment schedule changes will take effect:
The changes will take effect immediately with your July shipment, which will be sent to you in September. This shipment will include Windows 7.
Digitally access your software benefits:
We would like to remind you that all partners may digitally download these new products immediately upon availability, often 4-6 weeks before your shipment arrives. You can go here to access the software download site and valuable resources to help you get started. Note: Windows® 7 Enterprise will be available for download by mid-August.