Arm unveils chip to make smartphones faster, cooler

Arm Holdings PLC today introduced a processor for smartphones and other devices that it says will be cheaper, more powerful and more energy-efficient than the two ARM processors it will supplant. The new Cortex-A5 processor will come with one to four processor cores running at up to 1GHz. Based on an efficient 40-nanometer design, the Cortex-A5's cores will run up to three times faster than each core in the company's low-end ARM9 product, according to the Cambridge, England-based firm. The company expects that the new chip will also help it ward off incursions from Intel Corp.'s rival Atom chip. ARM announced the chip Wednesday at its annual ARM TechCon3 conference in Santa Clara, Calif.

The Cortex-A5 also consumes one-third the power of ARM's mid-range ARM11 processor, meaning that it can extend a device's battery life "by at least double, if not more," said Travis Lanier, a product manager at Arm. The Cortex-A5 is due to be released to ARM's 600-plus manufacturing partners in December, the company said. The new chip costs about one-fifth as much to manufacture as the ARM9, which has been installed in more than 5 billion cellphones and other devices, or the ARM11, which is used in Apple's iPhone and iPod Touch. Meanwhile, Intel is readying a 32-nanometer version of the popular Atom netbook processor. The 32nm chip will be the first "low-leakage" version of the Atom, and is expected to be similar to current Arm CPUs in terms of compactness and power efficiency, making it suitable for mobile phones and tiny devices. Supported software includes Ubuntu Linux, Android and the Firefox browser.

Arm says it expects the Cortex-A5 to maintain its edge over the Atom. "This is a very, very tiny processor," Lanier said. "It will be many generations before the Atom can compete with it." The Cortex-A5 will be able to run all of the software that runs on the other Arm Cortex processors - the Cortex-A8 and the Cortex-A9, Lanier said. The Cortex-A5 also supports Arm's Neon multimedia technology, which can be used to improve video performance. Arm expects the Cortex-A5 to be used in low-end to mid-range cellphones, smart appliances such as refrigerators, washing machines and clothes dryers, and digital picture frames. Such devices should begin to appear with Cortex-A5 chips in 2011, Lanier said. At that point, the ARM9 and ARM11 chips will begin phasing out.

The company recently announced that its dual-core Cortex-A9 will be able to run as fast as 2GHz, enabling it to power ARM-based netbooks, or smartbooks, laptops and even desktop PCs, Lanier says. The Cortex-A8 and Cortex-A9 will continue to be Arm's mid-range and high-end processors. Lanier said that the Cortex-A5 will come in two flavors: a general-purpose processor running at 1GHz that consumes about 80 milliwatts, and a low-power, very efficient chip that runs at 500MHz. "There's almost no [electrical] leakage," Lanier said. The low-power version is better suited to phones, where battery life tends to be key.

Facebook Captchas broken?

Hackers have apparently found a way to automate the creation of new Facebook profiles by breaking the challenge-response mechanism used by the site to ensure that only humans sign up for the service. The pages are being used to spam links pointing to malicious sites; users who click on the links are prompted to install rogue anti-spyware tools on their systems. Security researcher Roger Thompson, of AVG Technologies, said today that his company in recent days discovered numerous Facebook pages that were clearly created in an automated fashion by malware programs.

All of the pages contain the same profile picture but different user names. So far, AVG has noticed a "couple of hundred" Facebook pages that appear to have been created by an automated malware program. From a security-threat standpoint, the Facebook break-in doesn't appear to be particularly serious, Thompson said in his blog. Simon Axten, a Facebook spokesman, said the company is investigating the report and is working on identifying the fake accounts "so we can disable them en masse." In an e-mail message, Axten said that the URL contained in the profiles has already been blacklisted by major Web browsers and has been blocked from being shared on Facebook. And Facebook will deactivate all the new accounts "as quickly as they find them." Even so, the fact that hackers got past Facebook's Captchas highlights a continuing trend of attackers seeking to exploit social networks, he said.

Facebook is trying to understand how the new accounts were created, though it is possible that the sign-up process was manual. The company uses a third-party Captcha service called reCAPTCHA, which was recently acquired by Google and "is about as well-regarded a Captcha provider as there is," he said. Another possibility is that those responsible for the attack farmed out the Captchas to be solved by humans for a price. "On the education front, we encourage users not to click on strange links and to take appropriate steps if they feel their computer or Facebook account has been compromised," he said. In a note posted today, the Internet Crime Complaint Center (IC3), which is a partnership between the FBI and the National White Collar Crime Center, warned about the trend. According to the IC3, fraudsters are using spam to promote phishing sites or to entice users to download an application or view a video. Fraudsters are continuing to hijack accounts on social networking sites and are using them to spread malicious software, the IC3 warned.

Often the spam is disguised to appear as if it were sent from a user's 'friend.' Users visiting such sites or clicking on the videos and photos then get infected by various pieces of malware. Some attackers also plant malicious ads containing malware downloads on social networking sites, the IC3 note warned. Users of social networking sites need to be aware of such threats and take measures to address them, the IC3 said. Adjusting Web site privacy settings, being selective about friends and what they are allowed to view, and disabling options such as texting and photo sharing when they are not being used are all ways users can protect themselves on social networking sites, it said.

Missing dot drops Sweden off the Internet

What was essentially a typo last night resulted in the temporary disappearance from the Internet of almost a million Web sites in Sweden - every address with a .se top-level domain name. According to Web monitoring company Pingdom, which happens to be based in Sweden, the disablement of an entire top-level domain "is exceptionally rare. … Usually it's a single domain name that has been incorrectly configured or the DNS servers of a single Web host having problems. Problems that affect an entire top-level zone have very wide-ranging effects as can be seen by the .se incident. … Imagine the same thing happening to the .com domain, which has over 80 million domain names." The total blackout of .se lasted for about an hour and a half, Pingdom says, although aftershocks are expected to continue. "The .SE registry used an incorrectly configured script to update the .se zone, which introduced an error to every single .se domain name," says Pingdom. "We have spoken to a number of industry insiders and what happened is that when updating the data, the script did not add a terminating '.' to the DNS records in the .se zone."

That trailing dot is necessary for DNS to understand that '.se' is the top-level domain, Pingdom explains. "It is a seemingly small detail, but without it, the whole DNS lookup chain broke down." Sweden's Internet Infrastructure Foundation, which administers .se, issued this statement: "The cause was an incorrect software update, which, despite our testing procedures, was not detected. Thanks to a well-functioning surveillance system, .SE discovered the error immediately, and a new file with the DNS data (zone file) was produced and distributed within one hour. … The false information that was sent out affected accessibility to all .se domains for a short time. However, there may still be some name servers that have not yet replaced the incorrect information with the correct data." A spokesperson for .se, Maria Eklund, told a Swedish press outlet that the issues may not be completely resolved before Wednesday. "This little mistake is going to affect Internet traffic for two days," she told the newspaper. (Speculation that it's really the fault of newly "internationalized" ICANN begins in 3 … 2 … 1.)
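The rule the script violated can be sketched in a few lines of Python. This is only an illustrative toy (the `qualify` helper is ours, not the registry's actual script), but it captures how a DNS zone compiler treats names with and without the terminating dot:

```python
def qualify(name: str, origin: str) -> str:
    """Apply the zone-file rule: a name ending in '.' is fully
    qualified, while any other name is treated as relative and
    has the zone origin appended to it."""
    return name if name.endswith(".") else f"{name}.{origin}"

# With the terminating dot, the record names exactly what it should:
assert qualify("example.se.", "se.") == "example.se."

# Without it, the name is treated as relative to the zone origin,
# so the zone ends up pointing at names that do not exist:
assert qualify("example.se", "se.") == "example.se.se."
```

In other words, dropping the dot effectively shifted every record under a nonexistent .se.se hierarchy until the corrected zone file was distributed.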

Microsoft defends its anti-malware software after Symantec piles on

Microsoft is defending the merits of its free Security Essentials anti-malware software after a top Symantec engineer badmouthed the new release. "Microsoft Security Essentials provides real-time protection that uses behavior monitoring and reputation services to help identify the malicious software as soon as it emerges in the ecosystem and then uses the Dynamic Signature Service to make the newest definitions available virtually real-time, without having to wait for the next signature download," Microsoft said in a statement. Earlier in the week, Jens Meggers, vice president of engineering for Norton products, claimed the newly released Security Essentials is just an unimpressive recycling of Microsoft's discontinued Live OneCare technology for Windows desktops. "It's just stripped down OneCare," Meggers said, citing a report from Dennis Technology Lab that compared Norton AntiVirus 2009 to Microsoft Security Essentials and deemed Norton stronger in malware defense by about a 2-to-1 margin (the test was sponsored by Symantec). Microsoft expressed disappointment in Symantec's claims but did not rebut each of Meggers' remarks. In its statement, Microsoft said it "continues to advocate for a defense in depth strategy that includes the use of anti-malware software, but also includes protections such as firewall and user account controls like those found in Windows, browser security like that in IE8 and continuous updates like those provided through Microsoft Update." Microsoft indicated it is offering Microsoft Security Essentials for free because "we still see far too many consumers worldwide that do not have up-to-date protection either because they cannot afford it, are concerned about the impact the suites will have on the performance of their PCs, or because they simply do not realize their AV software is not up to date." 
Offering its software for free, said Microsoft, "will remove some of the barriers in the way of consumers having quality anti-malware protection today."

Music industry signs online distribution agreement with EU

The European Commission has signed an agreement with the online music industry designed to improve consumers' access to online music across the 27-nation European Union, it said Tuesday. Online music retailers including Amazon.com and Apple, Finnish mobile phone giant Nokia, royalty rights collecting societies, consumer groups and the record labels EMI and Universal Music Group struck the deal with E.U. Commissioner for Competition Neelie Kroes. The agreement they reached sets out general principles that will underpin the online distribution of music in the future, leading to "improved online music opportunities for European consumers," the participants said in a joint statement. "European consumers want and deserve better online music offerings," Kroes said in a statement, describing the agreement as evidence of "real progress in this direction." This is the first time players involved in the distribution of music have agreed on "a common roadmap," she said. Apple is optimistic that over the coming year it will be able to make its iTunes online music store available in countries where it doesn't operate at present, the Commission said.

The biggest obstacle to creating a fully functioning online marketplace for music until now has been the reluctance of collecting societies to do away with their traditional approach to the European market, which involved each one maintaining a monopoly over rights collection in its national territory. Meanwhile, EMI expects to sign non-exclusive digital licensing agreements with two of the most obstinate collecting societies in Europe - SACEM of France and Spain's SGAE, the Commission said. The Internet's ability to reach across borders makes it harder for online stores to restrict sales to customers in a particular territory.

CA to buy NetQoS for $200 million

CA Monday announced plans to acquire NetQoS for $200 million, adding application-aware network and systems management products to the software maker's broad enterprise IT management portfolio. The added technology will also boost CA's efforts to manage advanced infrastructures that feature virtual systems and cloud computing environments, the vendor says. "NetQoS technology complements CA's Wily products and will help network and systems engineers better design their infrastructure to ensure application issues don't occur from the start," says Roger Pilc, senior vice president and general manager of CA's infrastructure and automation business unit. "The technologies will help network and systems management be more application aware." The deal, anticipated to close in CA's fiscal third quarter, would augment an already full software lineup grown via previous acquisitions of Wily Technology, Concord Communications and Aprisma. CA executives say the pending acquisition offers little overlap by way of products and will help CA products diagnose the root cause of application errors within the network and systems infrastructure.

CA executives say NetQoS products, designed for network managers responsible in part for application delivery, will add to the company's Wily products that detect performance problems in the application environment. NetQoS tools are able to detect application performance problems using network-centric measures such as traffic flow. "The acquisition is good because NetQoS has a focus on application delivery, so when combined with Wily, it offers a good one-two punch. Customers can visualize the links and relationships between the delivery technologies and the business applications and services with Wily, and understand the real-time application and service activity across those links and relationships with NetQoS traffic flows," says Jasmine Noel, co-founder and principal analyst at Ptak, Noel & Associates. With some areas of overlap in the former Concord eHealth and Aprisma Spectrum tools, CA's Pilc says the company will work to address issues after the deal closes. CA also expects the NetQoS technology to play a bigger role in its virtual and cloud management offerings. Noel says customers should not expect NetQoS tools to get lost in the shuffle, as CA could have targeted plans for each product suite. "In terms of portfolio, CA now has two network performance management solutions, eHealth and NetQoS. But I think CA has specific targets for both solutions," she says. "CA's eHealth technology will target network engineers who spend most of their time managing performance of specialized network infrastructure. NetQoS technology will target network engineers who focus on application delivery where the management of traffic flows is the primary task, rather than the management of thousands of network devices."

With its ability to track flows across virtual and physical elements, NetQoS tools could be coupled with Cassatt assets CA acquired earlier this year, the company says. NetQoS has more than 1,000 customers worldwide and reported revenue of $56 million in 2008. With no previous partnerships, the two vendors share some 200 customers, and CA's Pilc foresees "very little modification in the NetQoS product set and its approach to customers going forward." NetQoS co-founder and CEO Joel Trammel says CA represented the best fit with his company's technology, and customers shouldn't expect any change in products or support as the deal unfolds. "We sought out CA because we saw a clear fit with us and the company's success in acquiring Wily, Concord and Aprisma. We were excited and see the clear fit between tying these acquisitions together," Trammel says. That is why NetQoS executives found the deal to be synergistic. Industry watchers expect the deal could benefit both parties going forward if CA sales teams focus on the NetQoS suite. "For a small vendor, being acquired could be good because a larger sales force means a bigger pipeline. Or it could be bad if it gets lost in the portfolio. In the Swainson era, CA has handled its acquisitions fairly well, and with Wily as a tag-team partner I don't see NetQoS getting lost," Noel says.

Intel: Chips in brains will control computers by 2020

By the year 2020, you won't need a keyboard and mouse to control your computer, say Intel Corp. researchers. Scientists at Intel's research lab in Pittsburgh are working to find ways to read and harness human brain waves so they can be used to operate computers, television sets and cell phones. Instead, users will open documents and surf the Web using nothing more than their brain waves.

The brain waves would be harnessed with Intel-developed sensors implanted in people's brains. The scientists say the plan is not a scene from a sci-fi movie - Big Brother won't be planting chips in your brain against your will. Rather, researchers expect that consumers will want the freedom they will gain by using the implant. "I think human beings are remarkably adaptive," said Andrew Chien, vice president of research and director of future technologies research at Intel Labs. "If you told people 20 years ago that they would be carrying computers all the time, they would have said, 'I don't want that. I don't need that.' Now you can't get them to stop [carrying devices]. There are a lot of things that have to be done first, but I think [implanting chips into human brains] is well within the scope of possibility." Intel research scientist Dean Pomerleau told Computerworld that users will soon tire of depending on a computer interface, and of having to fish a device out of their pocket or bag to access it. He also predicted that users will tire of having to manipulate an interface with their fingers. Instead, they'll simply manipulate their various devices with their brains. "We're trying to prove you can do interesting things with brain waves," said Pomerleau. "Eventually people may be willing to be more committed ... to brain implants.

Imagine being able to surf the Web with the power of your thoughts." To get to that point, Pomerleau and his research teammates from Intel, Carnegie Mellon University and the University of Pittsburgh are currently working on decoding human brain activity. Pomerleau said the team has used functional magnetic resonance imaging (fMRI) machines to determine that blood flow changes in specific areas of the brain based on what word or image someone is thinking of. Basically, there are standard patterns that show up in the brain for different words or images. People tend to show the same brain patterns for similar thoughts, he added. For instance, if two people think of the image of a bear, or hear the word bear, or even hear a bear growl, a neuroimage would show similar brain activity. Pomerleau said researchers are close to gaining the ability to build brain-sensing technology into a headset that could be used to manipulate a computer.

The next step is development of a tiny, far less cumbersome sensor that could be implanted inside the brain. Such brain research isn't limited to Intel and its university partners. Almost two years ago, scientists in the U.S. and Japan announced that a monkey's brain had been used to control a humanoid robot. Miguel Nicolelis, a professor of neurobiology at Duke University and lead researcher on the project, said that researchers were hoping the work would help paralyzed people walk again. And a month before that, a scientist at the University of Arizona reported that he had successfully built a robot that is guided by the brain and eyes of a moth. Charles Higgins, an associate professor at the university, predicted that in 10 to 15 years people will be using "hybrid" computers running a combination of technology and living organic tissue.

Today, Intel's Pomerleau said various research facilities are developing technologies to sense activity from inside the skull. "If we can get to the point where we can accurately detect specific words, you could mentally type," he added. "You could compose characters or words by thinking about letters flashing on the screen or typing whole words rather than their individual characters." Pomerleau also noted that the more scientists figure out about the brain, the better they will be able to design microprocessors. "If we can see how the brain does it, then we could build smarter computers," he said.

Apple introduces its first home server

Among the retooling of the more prominent iMac and MacBook lines earlier Tuesday, Apple quietly introduced its first small-scale server, a $999 box based on the Mac mini that one analyst called the only significant announcement of the day. Dubbed the "Mac mini with Snow Leopard Server," the new device is essentially a $799 Mac mini with two 500GB hard drives squeezed into the petite case, compared to the one 320GB drive in the stand-alone mini. Apple installs the server edition of Mac OS X 10.6, aka Snow Leopard, which costs $499 when sold separately. The $999 Mac mini is equipped with a 2.53GHz Intel Core Duo processor; an Nvidia GeForce 9400M integrated graphics chipset - the same as used in the entry-level MacBook and the three least-expensive MacBook Pro models; five USB 2.0 ports; a Gigabit Ethernet connection; and a single FireWire 800 port. "This is the one interesting thing about today's announcements," argued Ezra Gottheil, an analyst with Technology Business Research. "It's perfect for a very small business or a classroom, but it will make a sweet home server as well." Apple did not pitch the Mac mini-based server as at-home hardware - instead, it touted the new system as "perfect for any small business or group" - but Gottheil sees it as the company's first move into a potentially broader market. "This wouldn't be bad in the house," he said. "It has a bunch of USB ports, 4GB of memory, it can connect to a home wireless network, and decent if not great graphics."

Apple yanked the optical drive from the Mac mini to fit the second hard drive in the case, a potential problem unless users pop for the $99 external SuperDrive. And unlike the Time Capsule, Apple's backup and wireless device, the Mac mini lacks a built-in router. Even though Gottheil trumpeted the Mac mini as a potential rival to Windows in the home server market, he acknowledged that the biggest audience is very-small-to-small businesses. "This is significant," he said. "This says Apple is going to put a real toe-hold in really small business. It's a 'My First Server' device, a 'My First Pony,' for small offices that want to get serious about backup and hosting their own e-mail." Its other selling point, said Gottheil, is the unlimited user allowance that comes with Snow Leopard Server. "You can have as many users as you want connected to this," Gottheil said. "You can't get that from Microsoft for this price." Microsoft has been selling a version of its Windows operating system specially crafted for home servers since 2007; Windows Home Server supports a maximum of 10 users. The company is working on a Windows 7-compatible edition of Windows Home Server but has delayed the release of that upgrade, citing the need for additional testing. The Windows 7-ready upgrade is now set to ship before the end of the year. The Mac mini with Snow Leopard Server is available now on Apple's online store, and will ship within 24 hours of ordering.

Google Voice Frees Your Voicemail, and Your Number

Until yesterday, signing up for a Google Voice account required you to pick a new phone number - not a pleasant option for those who have kept the same digits for years. Now Google has enabled users to keep their existing phone numbers and get (most of) the features Google Voice offers, including Google's excellent voicemail service. When you sign up for Google Voice - which is still not widely available to the public (you need to get an invite or request one) - you can either choose a Google one-stop phone number or keep your own for a more pared-down experience.

Keeping your old digits gives you: online, searchable voicemail; free automated voicemail transcription; custom voicemail greetings for different callers; email and SMS notifications; and low-priced international calling. Going for the full-throttle Google experience gives you all of the above, plus: one number that reaches you on all your phones; SMS via email; call screening; Listen In; call recording; conference calling; and call blocking. If you already have a Google Voice number, you can add the voicemail option to any mobile phone associated with the account. Since voicemails are transcribed and placed online, even made publicly available for sharing purposes, there has been some danger of said voicemails appearing in search results. Happily, Google circumvented this problem earlier this month. Some of the awesome benefits are explained in Google's YouTube explanation. These new features are both freeing and limiting: you can keep your number but sacrifice some of the goodies that make Google Voice a powerful contender in the telephony business.

Full number portability is likely coming in the future, after, of course, Google deals with AT&T, Apple, and the FCC. But some have high hopes that eventually the opposition will grow to accept and embrace Google Voice.

HP extends data center, campus Ethernet switches

HP this week unveiled enhancements to its data center and campus switching portfolio designed to increase density and tightly integrate switching with blade server systems. The extensions are intended to better align HP's data center and network products for the market and mindshare battle with Cisco. The products include blade switches designed to reduce cost and improve security in the data center; an 8Gbps FibreChannel Virtual Connect module and firmware upgrade intended to tune bandwidth to application requirements; and new chassis and modules for its ProCurve 5400 and 8200 Ethernet switches to provide an array of configurations depending on network need. Cisco's recently shipped Unified Computing System integrates data center servers, switching, virtualization and storage access, and is viewed as a competitive assault on HP and IBM's traditional data center turf.

The new HP blade switches include the 10G Ethernet ProCurve 6120XG and ProCurve 6120G/XG. The 6120XG sports eight 10G uplinks - one 10GBASE-CX4 Ethernet or one SFP+, five SFP+ that can be either 1Gbps or 10Gbps, and two midplane crosslinks or SFP+. The 6120XG is also Converged Enhanced Ethernet (CEE) "ready," which means it will support an upgrade to the CEE standards for the integration of Ethernet and Fibre Channel in the future. Those standards are expected to be ratified in the first or second quarter of 2010. The 6120G/XG is designed to facilitate the transition from 1G Ethernet to 10G Ethernet. In addition to aiding in the migration to 10G, the 6120G/XG supports the attachment of legacy network equipment in the data center. It features one 10G Ethernet CX4 port for short-distance data center links; two 10G Ethernet XFP ports for copper or fiber connectivity; two 1G Ethernet SFP ports; and four 1G Ethernet RJ45 ports. The 6120XG costs $11,500; the 6120G/XG costs $5,500. The campus LAN enhancements, meanwhile, are viewed as lower-cost alternatives to Cisco in that market as well. For the campus LAN, HP unveiled a half-size chassis version of its 8212zl switch.

It features six chassis slots and the same hardware and software architecture as the 8212zl. The 8206zl is designed for high-density LAN access, mid-size LAN core and distribution-layer applications, and data center end-of-row access and aggregation. It costs $12,599. New modules for both the 8200zl and 5400zl switches include a 24-port 10/100/1000 PoE+ card, $4,199; a 20-port 10/100/1000 PoE+ board with four SFP ports, $4,199; a 24-port 10/100 PoE+ card, $2,499; and a four-port 10G SFP+ module, $4,199. These products are intended to support new bandwidth-intensive applications such as video while reducing cost and power consumption. HP also unveiled a 1500-watt Power over Ethernet Plus (PoE+) power supply for the lines at a list price of $1,099. The company is also offering bundled configurations of its 5400 switches. A 12-slot 5412zl with 96 Gigabit Ethernet ports - 92 ports of PoE+ and four ports of SFP - and two 1500W PoE+ power supplies costs $17,199. A 48-port 5406zl bundle - 40 PoE+ gigabit ports and four SFP ports - plus one 1500W PoE+ power supply costs $8,599. The Virtual Connect 8Gbps FibreChannel module supports 20 ports of 8Gbps uplink and downlink bandwidth. It is backward compatible and replaces a 20-port 4Gbps Fibre Channel Virtual Connect module already offered by HP. It costs the same as the 4Gbps module - $9,499 - and supports twice the number of virtual LANs - 128 - per uplink set. The Virtual Connect firmware upgrade provides dynamic bandwidth adjustment depending on application requirements, HP says.

HP would not discuss product plans for data center core switches and 48-port and higher top-of-rack 10G switches, with or without support for FibreChannel-over-Ethernet; nor would it discuss product plans for a FibreChannel-over-Ethernet Virtual Connect module. All products are available now.

Dell-Perot Deal Spells Trouble for Tier-Two Outsourcers

The consolidating IT services market contracted a bit further on Monday with Dell's announcement that it will acquire Perot Systems for $3.9 billion. The fact that Dell paid nearly a 70 percent premium on Perot's stock price to seal the deal confirms "the value of integrating hardware and services for infrastructure management is clearly gaining momentum," says Peter Bendor-Samuel, CEO of outsourcing consultancy Everest Group, which counts both Perot and Dell among its clients. It also suggests, he adds, that the size of outsourcing/hardware companies will continue to increase in importance. The Texas twosome can hardly match the scale of HP or IBM on the outsourcing front - Perot brings just $2.7 billion in services revenue to the table - but the matchup is clearly made in their image.

But Dell, struggling as a hardware manufacturer at a time when infrastructure sales are slow, wants in on the outsourcing business, even if it takes several acquisitions to do it. "Perot's capabilities are focused on a few geographies and industries, which Dell will need to grow or complement with other acquisitions to attain greater scale to compete head-on with the likes of HP and IBM," says Bendor-Samuel. Neither company is likely to be too worried about the competition at this point. While Perot operates in some high-interest industries - most notably healthcare and government services - its footprint remains relatively small. It's more likely that Dell-Perot will make inroads on smaller deals. "Dell and Perot Systems can exert pressure in this sector, and if played right, could see their market share increase in the midmarket in both products and services," says Stan Lepeak, managing director at outsourcing consultancy EquaTerra. India-based providers who've been attempting to ramp up their infrastructure offerings "must continue to find ways to grow and reach meaningful scale," says Bendor-Samuel. As such, it's the tier-two players that will be watching the Dell-Perot deal closely.

Meanwhile, traditional IT services players that have yet to walk down the aisle with a hardware vendor, such as ACS, CSC and Unisys, may be wondering how wise it was to stay single. "They will be asking themselves how they can grow in the infrastructure space to meet the increased threat posed by the integrated hardware and services offerings of IBM, HP, and now Dell," Bendor-Samuel says. While Dell may be eager to keep Perot clients, and their relatively healthy profit margins, existing customers should proceed with caution. (See Five Steps to Take if Your Outsourcer is Sold.) Specifically, clients should assess any impact the deal has on non-Dell hardware options, Lepeak advises. Those most worried about the Perot deal are Dell customers working with other outsourcers. "While growing the legacy Perot Systems client base, Dell must use caution not to alienate hardware clients who are using other service providers for outsourcing services," says Lepeak. As for integration issues, Dell and Perot may have an easier go of it than most. "Good cultural alignment, close physical proximity for key leaders, and the absence of an entrenched services business at Dell, together with the obvious convergence around the value of Perot as a hardware channel for Dell and Dell as a lead generator for Perot, should make integration much faster and less painful than is the norm for deals of this scale," says Mark Robinson, EquaTerra's chief operating officer.

Apple tablet won't be just an e-reader, argues analyst

Analysts split today in their take on recent reports that Apple's long-rumored tablet will stress the device's e-book capabilities, saying that the company's plan for the "iPod Touch on steroids" would depend on the price tag. Earlier this week, the popular gadget blog Gizmodo cited unnamed sources who claimed that Apple was in talks with several media companies, including the New York Times, to negotiate content deals for its unannounced-but-expected tablet. "[Apple isn't] just going for e-books and mags," Gizmodo's Brian Lam wrote Wednesday. "They're aiming to redefine print." Not so fast, said one analyst. "It's more than just an e-reader," said Ezra Gottheil, an analyst with Technology Business Research who follows Apple's moves. "It's an application platform, it's a game and social gaming platform. It certainly will be an e-reader, that will be part of its ecosystem, but that won't be all it is." Gottheil, who six months ago touted the idea that Apple would deliver a tablet best described as an "iPod Touch on steroids," stuck to that reasoning today. "It will use the iPhone OS, or a modified version of it," Gottheil said, echoing something iLounge.com said it heard from a reliable source this week. The App Store, which Apple said this week had delivered its two billionth application, is crucial to the tablet's success, said Gottheil, which means that the device will be more than a one-trick pony. "Apple will market it as 'one more thing' nested inside 'one more thing'," Gottheil said, a move possible because of the App Store's broad library. "They'll [cast] it as able to do several increasingly cool things." Gottheil's reasoning relies on the $800 price he expects Apple to slap on the tablet, a price tag much too high for a media-reader-only device. "I don't think Apple has any particular interest in just creating another Kindle," he said, referring to Amazon's $489 Kindle DX.
"Apple enjoys skimming the top of the market by making something hot and getting a nice margin out of it." Brian Marshall, a Wall Street analyst with Broadpoint AmTech, had a much different take, largely because of his price expectations. "I think $500 is the price," said Marshall today, adding that he agreed with Gizmodo that the tablet will focus on its e-reader capabilities. "I actually think that's how they'll promote it," he added. "They'll pitch [e-books] as a big segment, but they'll also say, 'We're gonna do this in color and much better than the Kindle'." Amazon's Kindle DX features a 9.7-inch grayscale display; according to reports out of Taiwan, component suppliers building parts for the expected Apple tablet are assembling 9.6-inch color, touch-enabled screens.

Most analysts have pegged the first half of 2010 for a tablet rollout, although some have proposed that Apple will craft a two-stage introduction, as it did with the iPhone in 2007, by announcing the hardware several months in advance of availability to give developers time to create applications or tweak existing iPhone programs for the larger device.

Virtualization: Tips for avoiding server overload

As virtualization stretches deeper into the enterprise to include mission-critical and resource-intensive applications, IT executives are learning that double-digit physical-to-virtual server ratios are things of the past. Virtualization vendors may still be touting the potential of putting 20, 50, or even 100 VMs (virtual machines) on a single physical machine, but IT managers and industry experts say those ratios are dangerous in production environments, causing performance problems or, worse, outages. "In test and development environments, companies could put upwards of 50 virtual machines on a single physical host. But when it comes to mission-critical and resource-intensive applications, that number tends to plummet to less than 15," says Andi Mann, vice president of research at Enterprise Management Associates (EMA) in Boulder, Colo.

In fact, EMA conducted a study in January 2009 of 153 organizations with more than 500 end users and found that on average they were achieving 6:1 consolidation rates for applications such as ERP, CRM, e-mail and database. The variance between the reality and the expectations, whether it's due to vendor hype or internal ROI issues, could spell trouble for IT teams. "If you go into these virtualization projects with a false expectation, you're going to get in trouble," Mann says. That's because the consolidation rate affects just about every aspect of a virtualization project: budget, capacity and executive buy-in.

Indeed, overestimating P-to-V ratios can result in the need for more server hardware, power consumption, heating and cooling, and rack space, all of which cost money. Worse yet, users could be impacted by poorly performing applications. "If a company thinks they're only going to need 10 servers at the end of a virtualization project and they actually need 15, it could have a significant impact on the overall cost of the consolidation and put them in the hole financially. Not a good thing, especially in this economy," says Charles King, president and principal analyst at consultancy Pund-IT in Hayward, Calif.

Key apps will fight for server space

So, why the disconnect between virtualization expectations and reality? King says up to this point, many companies have focused on virtualizing low-end, low-use, low-I/O applications such as test, development, log, file and print servers. "When it comes to edge-of-network, non-mission-critical applications that don't require high availability, you can stack dozens on a single machine," he says. Bob Gill, managing director of server research at consultancy TheInfoPro, agrees. "Early on, people were virtualizing systems that had a less than 5% utilization rate. These were also the applications that, if they went down for an hour, no one got upset," he says.
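King's 10-versus-15-server example is straightforward capacity arithmetic. A minimal sketch (the 150-VM workload count and both ratios are hypothetical illustrations, not figures from the study):

```python
import math

def hosts_needed(num_vms: int, ratio: float) -> int:
    """Physical hosts required to run num_vms at a given VM-per-host ratio."""
    return math.ceil(num_vms / ratio)

# A project budgeted at a touted 15:1 ratio that only achieves 10:1 in
# production needs 50% more hardware than was planned for.
planned = hosts_needed(150, 15)   # 10 hosts budgeted
actual = hosts_needed(150, 10)    # 15 hosts really needed
```

Every extra host carries its own power, cooling and rack-space costs, which is why a two- or three-VM miss per host compounds quickly at scale.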

That's not the case when applying virtualization to mission-critical, resource-intensive applications. Once you get into applications with higher utilization rates, greater security risks and increased performance and availability demands, consolidation ratios drop off considerably. "These applications will compete for bandwidth, memory, CPU and storage," King says. Even on machines with two quad-core processors, highly transactional applications that have been virtualized will experience network bottlenecks and performance hits as they vie for the same server's pool of resources. Virtualization vendors have been slow to explain this reality to customers, some say.

Start with a capacity analysis

To combat the problem, IT teams have to rework their thinking and dial back everyone's expectations. The best place to start: a capacity analysis, says Kris Jmaeff, information security systems specialist with Interior Health, one of five health authorities in British Columbia, Canada.

Four years ago, the data center at Interior Health was growing at a rapid rate. There was a lot of demand to virtualize the 500-server production environment to support a host of services, including DNS, Active Directory, Web servers, FTP and many production application and database servers. Before starting down that path, Jmaeff first used VMware tools to conduct an in-depth capacity analysis that monitored server hardware utilization. (Similar tools are also available from CiRBA, Hewlett-Packard, Microsoft, PlateSpin and Vizioncore, among others.) Rather than looking at his environment in a piecemeal fashion by each piece of hardware, he instead considered everything as a pool of resources. "Capacity planning should . . . focus on the resources that a server can contribute to the virtual pool," Jmaeff says. Already, the team has been able to consolidate 250 servers, 50% of the server farm, onto 12 physical hosts. Jmaeff uses a combination of VMware vCenter and IBM Director to monitor each VM for "telltale signs" of ratio imbalances such as spikes in RAM and CPU usage or performance degradation. "We've definitely had to bump applications around and adjust our conversion rates according to server resource demand to create a more balanced workload," he says.

While Jmaeff's overall data center average is 20:1, hosts that hold more demanding applications either require much lower ratios or demand that he balance out resource-intensive applications. "Because we did our homework with ratios of virtual servers by examining the load on CPU and memory and evaluated physical server workloads, we've been pleasantly surprised with our ratios," he says. If necessary, it's easy to clone servers and quickly spread the application load, he adds.

Continuous monitoring is key

At Network Data Center Host, a Web service provider in San Clemente, Calif., the IT team quickly learned that when it comes to virtualizing mission-critical applications, you have to consider more than just RAM. "We originally thought, based on available RAM, we could have 40 small customers share a physical server. But we found that with heavier-used applications, it's not the RAM, it's the I/O," says CTO Shaun Retain. The 40:1 ratio had to be pulled back to 20:1 at the greatest, he says.

In addition, NDC Host uses homegrown monitoring tools to ensure that ratios aren't blown by a spike in a single VM's traffic. To help with that effort, the team has written a control panel that allows their customers to log in and see how their virtual machine is handling reads, writes, disk space usage and other performance-affecting activity. Pund-IT's King says companies should also conduct rigorous testing on their virtualized mission-critical applications before and after deployment. "You have to make sure that in terms of memory and network bandwidth, each application is stable at all times. For instance, if you know an application is harder hit during certain times of the year, you'll want to account for that in establishing your ratios," he says. Testing will also help IT teams determine which virtual workloads will co-exist best on a physical host. "You have to make sure that a physical server isn't running multiple VMs with the same workload. Otherwise, if they're all Web servers, they will be contending for the same resources at the same time and that will hinder your consolidation ratio," says Nelson Ruest, co-author of "Virtualization: A Beginner's Guide" and founder of the Resolutions Enterprise consultancy in Victoria, British Columbia.
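Ruest's advice about not stacking identical workloads on one host can be sketched as a toy placement pass. The VM names, workload tags and greedy heuristic below are illustrative assumptions, not taken from any vendor's scheduler:

```python
from collections import defaultdict

def place(vms, num_hosts):
    """Greedy placement: prefer the host with the fewest VMs of the same
    workload type, breaking ties by overall VM count, so that e.g. Web
    servers that peak together land on different physical machines."""
    hosts = defaultdict(list)
    for name, wtype in vms:
        target = min(
            range(num_hosts),
            key=lambda h: (sum(1 for _, t in hosts[h] if t == wtype),
                           len(hosts[h])),
        )
        hosts[target].append((name, wtype))
    return dict(hosts)

vms = [("web1", "web"), ("web2", "web"), ("db1", "db"), ("mail1", "mail")]
layout = place(vms, 2)  # the two Web servers end up on different hosts
```

A real scheduler would weigh CPU, memory and I/O headroom as well; the point is only that same-type workloads spread across hosts instead of contending on one.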

Ruest also warns IT teams not to forget the spare resources that host servers need so they can not only support their own VMs, but also accept the workload from a failing host. "If you're running all your servers at 80%, you won't be able to support that necessary redundancy," he says. Instead, IT staffers should make sure that workloads are heterogeneous and well-balanced based on peak usage times and resource demands. Most organizations will find they need to dedicate at least a month to the capacity planning and testing phases to determine the appropriate P-to-V server ratios for their environment, Ruest says. Finally, EMA's Mann advises IT teams to seek out peers with similar application environments at large annual meetings like VMware's VMworld conference or Citrix's Synergy, or through local user groups. "Most attendees are more than willing to share information about their environment and experiences," he says. Rather than relying on vendor benchmarks, get real-world examples of what has worked and what hasn't at organizations with your same profile. "You'll have a better chance at setting realistic expectations."

Gittlen is a freelance technology writer in the greater Boston area who can be reached at sgittlen@verizon.net.

Grassley seeks proof of jobs from H-1B applicants

WASHINGTON - One of the U.S. Senate's leading critics of the H-1B visa program, Sen. Charles Grassley (R-Iowa), is asking immigration officials to toughen their demands for evidence from companies hiring visa workers. Grassley wants IT consulting companies that place H-1B workers at third-party client sites to prove that there is work waiting for them.

The timing of his request to U.S. Citizenship and Immigration Services (USCIS) is no accident, nor is Grassley's interest. His letter to USCIS Director Alejandro Mayorkas, released Tuesday, comes just prior to the start of the new fiscal year on Oct. 1. Since April 1, the start of the annual petition process, 66,700 H-1B petitions have been filed, a number well short of the cap; the U.S. can issue up to 85,000 H-1B visas under the cap, with 20,000 set aside for advanced-degree graduates of U.S. universities. IT employment is down generally, and with it, demand for the visa.

About a year ago, Grassley released a USCIS study that found evidence of fraud or other violations in one out of five H-1B visa petitions. In a statement accompanying the release of his letter to Mayorkas, Grassley said that "Employers need to be held accountable so that foreign workers are not flooding the market, depressing wages, and taking jobs from qualified Americans. Asking the right questions and requesting the necessary documents will go a long way in getting the fraud out of the H-1B program."

Five months after USCIS completed its fraud study, federal officials arrested about a dozen people and charged them with fraud. One of the cases involved a New Jersey company, Vision Systems Group Inc., alleged to have set up shell offices in Grassley's home state. The U.S. recently expanded the case; the company is fighting the charges in federal court.

Grassley said in his letter that the USCIS should be asking companies "up front for evidence that H-1B visa holders actually have a job awaiting them in the U.S.," so that workers do not end up "benched," or unpaid until work is found. In response, a USCIS official said Mayorkas has received the letter and will respond to it. Grassley is also seeking information on the progress the USCIS has made on a number of other issues addressed in the fraud report, including job duties that differ from those described in the petition and failure to pay prevailing wages.

Grassley's call for tougher steps comes at the same time that some immigration attorneys have complained of stepped-up enforcement efforts this year, especially requests for more evidence to support petitions. Grassley, along with U.S. Sen. Richard Durbin (D-Ill.), has introduced legislation that would toughen the rules of the H-1B program and impose a number of restrictions, especially on Indian firms and their ability to use large numbers of visa holders without hiring a proportional number of U.S. workers.

Nortel users should hope for best, prepare for worst

Product overlap, consolidation and subsequent support are the biggest issues facing Nortel enterprise customers on the heels of Avaya's $900 million purchase of that business. Users should chart the progress of Avaya's purchase of Nortel's enterprise assets carefully, so that they are spared any unpleasant product integration or rationalization surprises. Avaya last week emerged as the winning bidder for Nortel's enterprise business, beating out Siemens Enterprise Communications for the asset. Avaya last week also won court approval for the purchase.

Now comes the uneasy task of sifting through the product portfolio and eliminating redundancies, an ordeal that could leave Nortel, and even Avaya, users with a shortened lifespan on their investments. "Like an onion, there are lots of layers," says Nortel customer Bruce Meyer, director of network services at ProMedica Health Systems in Toledo, Ohio. "Let's see where they go from here." "There may be some surprises there," says Bob Hafner, an analyst with Gartner. "These are going to be two large companies coming together. It's not the easiest thing to do. These things never go without issues, problems or concerns." Significant overlap is expected in the IP telephony/unified communications portfolios of both companies, such as IP PBXs, handsets and call management software. Avaya is the leading revenue market-share vendor in enterprise telephony, according to Dell'Oro Group, while Nortel is No. 4. Little to no overlap will be found in routers, switches and other infrastructure products, where Nortel has a significant market share and installed base. Indeed, Meyer believes Nortel routers and switches will be less susceptible to discontinuation than the VoIP products, because Avaya has virtually no data products. "With Avaya, there's not a lot of strength in enterprise data," Meyer says. "[Avaya] will want to know that the infrastructure is good. We need a reliable infrastructure." "The biggest issue for users is, 'Show me the [product] road map,'" says Henry Dewing of Forrester Research. "They want to see hardcore product plans and how they are going to actually consolidate product lines." Avaya has pledged near-term support for the Nortel enterprise products, including those serviced by Verizon, a Nortel reseller.

Verizon filed motions last week seeking assurances that Avaya would continue to support the Verizon accounts, which the carrier says include many federal law enforcement agencies. "I'd be surprised if that issue doesn't work itself out," says IDC analyst Abner Germanow of the Verizon/Avaya scuttle. "I'd have a hard time believing they'd leave the U.S. government out to dry." Longtime users such as Meyer and ProMedica would also like support assurances. In addition to product direction, Meyer hopes the relationship his company has had with Nortel sales, service and support representatives remains intact. "We're talking about lots of long-term relationships." To that end, Avaya kicked in $15 million for employee retention, on top of the $900 million purchase price for Nortel Enterprise Solutions. Nortel enterprise chief Joel Hackney said last week that Avaya could retain as much as 75% of Nortel's enterprise staff, though he would not say how many the unit employed.

Published reports, however, stated that Avaya may retain only 60% or less of the Nortel enterprise workforce, a situation that troubles Meyer. "My concern is reduced staff," he says. "What are those reductions going to mean? Brand loyalty comes from post-sales support. If those relationships change because of staffing changes, that would be a big deal." Gartner's Hafner agrees. "Customers need to pay attention to what's going on in the [merged] organization" to detect any potential distractions, turf battles or downsizings that may adversely affect them, he says. IDC's Germanow is advising Nortel customers to accelerate any assessment or planning activities in light of the Avaya takeover. "They should figure out where their own needs lie and how to most effectively migrate," he says. "They should hold companies to their multi-vendor visions - that open means open." Meyer, for now, is holding fast and not contemplating any alternative vendor options in light of Avaya's takeover of Nortel's enterprise business. "This is still a wait-and-see scenario," he says. "How much of this will be a replay of Bay/Nortel?" he asks, referring to Nortel's 1998 acquisition of Bay Networks, which largely crippled the then-No. 2 player to Cisco in routers and switches. "This is going to be really interesting to watch."

Hacker leaks thousands of Hotmail passwords, says site

More than 10,000 usernames and passwords for Windows Live Hotmail accounts were leaked online late last week, according to a report by Neowin.net, which claimed that they were posted by an anonymous user on pastebin.com last Thursday. The post has since been taken down. Neowin reported that it had seen part of the list. "Neowin has seen part of the list posted and can confirm the accounts are genuine and most appear to be based in Europe," said the site. "The list details over 10,000 accounts starting from A through to B, suggesting there could be additional lists." Hotmail usernames and passwords are often used for more than logging into Microsoft's online e-mail service, however.

Many people log onto a wide range of Microsoft's online properties - including the trial version of the company's Web-based Office applications, the Connect beta test site and the Skydrive online storage service - with their Hotmail passwords. Accounts with domains of @hotmail.com, @msn.com and @live.com were included in the list. It was unknown how the usernames and passwords were obtained, but Neowin speculated that they were the result of either a hack of Hotmail or a massive phishing attack that had tricked users into divulging their log-on information. Microsoft representatives in the U.S. were not immediately able to confirm Neowin's account or answer questions, including how the usernames and passwords were acquired.

The BBC, however, reported early Monday that Microsoft U.K. is aware of the report that account information had been available on the Web, and said it's "actively investigating the situation and will take appropriate steps as rapidly as possible." If Neowin's account is accurate, the Hotmail hack or phishing attack would be one of the largest suffered by a Web-based e-mail service. Last year, a Tennessee college student was accused of breaking into former Alaska governor Sarah Palin's Yahoo Mail account in the run-up to the U.S. presidential election. Palin, the Republican vice presidential nominee at the time, lost control of her personal account when someone identified only as "rubico" reset her password after guessing answers to several security questions. David Kernell was charged with a single count of accessing a computer without authorization by a federal grand jury last October; Kernell's case is ongoing. Shortly after the Palin account hijack, Computerworld confirmed that the automated password-reset mechanisms used by Hotmail, Yahoo Mail and Google's Gmail could be abused by anyone who knew an account's username and could answer a single security question.

Unisys service uses the cloud to manage mobile devices

Unisys is introducing a new service on Wednesday that will allow its customers to better manage, secure and support mobile devices carried around by employees, company executives said on Tuesday. Staff now expect to use their choice of devices anytime and anywhere, and this causes problems for CIOs around cost, the cost of support, and the security of applications and data, said Tony Doye, president of Unisys' Global Outsourcing and Infrastructure Services group, in a telephone interview. CIOs are concerned about corporate data "roaming the streets," he added.

The service framework for the new end-user productivity services will support Windows Mobile phones and BlackBerry devices, with support for the iPhone and other devices available in later releases. Currently organizations generally manage devices with specific technologies that only work with a specific platform, rather than with a consistent framework across a variety of devices, said Sam Gross, Unisys' vice president for global IT outsourcing solutions. The framework is managed by Unisys for customers, and the management and support of the devices is also done from the company's services delivery centers around the world, he added. Some early-adopter customers, mainly in Central Europe, are already using the mobile-device management framework, he said. Unisys is also offering access to standard office suites by subscription through a service called Virtual Office as a service from the Unisys Secure Cloud.

The new service will enable CIOs to reduce end-user costs by providing support for different devices, desktop PCs, applications and mobile data access through a mix of traditional, virtualized and secure cloud-based service delivery models, Unisys said. The Unisys Secure Cloud has technology that protects data both on mobile devices and in storage, using a combination of encryption and dispersion of data. "The model that we are delivering is server-side virtualization services, and in this situation the data never ends up on the end-point," Gross said. Unisys is also offering generic services such as the ability to destroy the image on a device if it is reported lost, he added. Unisys' Unified Communications as a Service, also delivered through the Unisys Secure Cloud, offers Microsoft Exchange, Microsoft Office SharePoint Server and Microsoft Office Communicator applications in a multi-tenant environment. Besides offering these productivity applications, customers can also provide their employees with access to other applications running at the company, through the Unisys cloud, Gross said.

Indian ban on spurious mobile phones found inadequate

The Indian government has asked mobile service providers not to allow calls on their networks from mobile phones without proper International Mobile Equipment Identity (IMEI) numbers from Dec. 1, citing security reasons. The IMEI number is used by GSM (Global System for Mobile Communications) networks to identify mobile devices, and is used by operators to block a stolen phone from using the network. The order, however, has a glaring loophole, as it does not provide for the blocking of calls from phones that use "clone" IMEIs, said Pankaj Mohindroo, national president of the Indian Cellular Association (ICA), a trade body that represents mobile handset makers and other mobile technology vendors.

The Sept. 3 order from India's Ministry of Communications & IT only refers to phones that have no IMEI numbers, have a sequence of 0s in place of the IMEI number, or have "non-genuine" numbers that are not, in fact, IMEI numbers. Clone IMEIs are those that have been issued to registered handset vendors but have been copied on to phones of dubious origins, Mohindroo said. The ICA has told the government that handsets with clone IMEI numbers should also be banned in the interest of security, Mohindroo said. A large number of mobile phones sold in India are either spurious or unbranded, often sold at low prices without bills or warranty, and a large number of consumers have bought these phones because of their low prices. The use of mobile phones without proper IMEI numbers is seen by the government as a threat to the country's security, as terrorists have been found to use mobile phones extensively.

In a letter to service providers in April, the Ministry of Communications & IT recognized that some of the users of phones without proper IMEIs were "genuine innocent subscribers." The government earlier this year approved a Genuine IMEI Implant (GII) proposal from service providers that programs a genuine IMEI on such mobile handsets. Using software would be a far more attractive option than having to throw out the phones, said Sridhar T. Pai, CEO of Tonse Telecom, a firm that researches the telecom market in India, though he added that he had not evaluated the software yet. The ban on phones without adequate IMEI numbers has been delayed because of a lack of clarity from the government and also because of a slow response from service providers, which had earlier been ordered to block calls from phones without proper IMEIs from July 1, according to analysts. Operators have delayed implementing the ban because customers are their key assets and they will not do anything that will upset these customers, Pai said.

The Sept. 3 government order has expanded the ban to include mobile phones that have fake IMEIs, besides phones that have no IMEIs or a string of zeros in place of the IMEI. It has ordered service providers to make provisions in their Equipment Identity Register (EIR) so that calls from all three types of defaulting phones are rejected by the networks from Dec. 1. Phones with fake IMEI numbers are to be detected by reference to the IMEI database of the GSM Association (GSMA). The GSMA database will be able to detect fake IMEIs, but will not detect phones that have clone IMEIs unless there is also a device management program that reveals the specification of the device, Mohindroo said. The EIR will then have to check whether the IMEI matches the original device to which the number was issued, he added. The Cellular Operators Association of India, an association of GSM mobile operators, was not available for comment, but an official said in private that its members would be able to meet the Dec. 1 deadline.
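The three screens the order describes (no IMEI, a string of zeros, or a "non-genuine" number) map naturally onto a simple EIR-style check. The allocation set below is invented for illustration and merely stands in for the GSMA database; note that, as Mohindroo points out, a clone of a genuine IMEI would still pass this check:

```python
# Hypothetical allocation list standing in for the GSMA's IMEI database.
GENUINE_IMEIS = {"490154203237518", "352099001761481"}

def reject_call(imei: str) -> bool:
    """Return True if the network should reject the call under the order."""
    if not imei:                      # no IMEI at all
        return True
    if set(imei) == {"0"}:            # a string of zeros in place of the IMEI
        return True
    if imei not in GENUINE_IMEIS:     # "non-genuine" number, never allocated
        return True
    return False  # genuine IMEI; clones of a genuine number also reach here
```

Catching clones would additionally require matching the IMEI against the specific device it was issued to, which is the gap the ICA wants closed.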

China's Alibaba expects India joint venture this year

Top Chinese e-commerce site Alibaba.com aims to announce an Indian joint venture this year as the company expands its global footprint, it said Friday. Alibaba.com is in talks with an Indian reseller about forming a joint venture, CEO David Wei told reporters at a briefing. A deal in India, where Alibaba.com recently surpassed 1 million registered members, would be the latest in the site's efforts to grow abroad. "I've got a lot of confidence in India," said Jack Ma, CEO of Alibaba Group, the parent company of Alibaba.com.

Alibaba.com is a platform for small and medium businesses to trade everything from lumber and clothes to iPods and PC components. Its main member base is in China, but the site also has 9.5 million registered users in other countries and facilitates many cross-border trades. Alibaba.com already works with Indian publishing company Infomedia 18, its likely joint venture partner, to promote its platform in the country. The site also has a joint venture in Japan and recently launched a major U.S. advertising campaign to attract more users there. Ma and other top Alibaba executives visited the U.S. early this year for meetings with potential partners including Amazon.com, eBay and Google. Ma said Alibaba knows it needs to "do something" in Latin America as well.

When asked if the company would also seek to expand in Eastern Europe, Ma said, "I will be there." Alibaba will not hold a majority stake in joint ventures it forms, instead taking a share similar to the 35 percent it has in its Japan operation. "Our global strategy means partner with local people," Ma said. "We want partners and we want partners to control their business." Users place total orders of more than US$200 million each day on the Alibaba.com international platform, Wei said. About 50 percent of those orders go to Chinese exporters, he said.

Researchers slam fickle iPhone anti-fraud feature

The iPhone's new defense, meant to prevent users from reaching phishing sites, is inconsistent at best, a security researcher said today, with some users getting warnings about dangerous links while others are allowed to blithely surf to criminal URLs. Other experts said that the fickle feature is worse than no defense at all. Apple quietly added the anti-fraud feature to the iPhone's Safari browser with the update to iPhone 3.1, released Wednesday. But according to Michael Sutton, the vice president of security research at Sunnyvale, Calif.-based Zscaler, the new protection is "clearly having issues." At first, said Sutton, the anti-phishing feature was simply not working. "It was blocking nothing," Sutton claimed after testing iPhone 3.1's new tool Wednesday against a list of known fraudulent sites.

By Thursday, things had improved, but just barely. "Yesterday, it started blocking some sites, for some users, but it was inconsistent. Apple relies on Google 's SafeBrowsing API (application programming interface) for the underlying data used to build anti-phishing and anti-malware blocking lists for the desktop edition of its Safari browser. Some sites are being blocked, others are not." That led Sutton to believe that the feature's functionality wasn't the issue, but how Apple updates users with a "blacklist" of malicious sites. Other browser makers, including Google and Mozilla, also use SafeBrowsing. "It appears some iPhones are getting timely updates [from Apple], but others are not, or are getting different [block list] feeds," Sutton said. "I'm feeling better about the feature than I was Wednesday, but clearly Apple is still have issues. URLs that are blocked by Safari in Mac OS X open and direct users to malicious pages [on the iPhone]." Like Sutton, James reported inconsistencies in the anti-fraud feature's effectiveness. "All we've come up with is that sometimes it works and sometimes it doesn't," said James. "This is clearly more dangerous than no protection at all, because if users think they are protected, they are less careful about which links they click." The new feature is turned on by default in iPhone 3.1; the option to turn it off is in Settings/Safari/Security, and is listed as "Fraud Warning." Sutton, although willing to concede that Apple overall is improving its security track record, bemoaned the state of mobile security in general, and the iPhone's in particular. "The greater concern to me is that we're making the same mistakes in mobile that we made on the desktop," he said. "On the desktop, security has gotten slowly better, but [with mobile] we have a fresh start. With the [media] coverage of the problem, maybe they're resolving it, or trying to." 
On Thursday, researchers at Intego, a Mac-only antivirus vendor, echoed Sutton's findings. "This feature should warn users that they may be visiting a known malicious Web site and ask if they wish to continue," said Peter James, a spokesman for Intego who writes the company's Mac security blog . "However, we have extensively tested this feature, tossing dozens of phishing URLs at it, and it simply does not seem to work.

I would have thought we would have learned from our mistakes, but there's virtually no protection in mobile browsers." According to research conducted by NSS Labs, which was hired by Microsoft to benchmark different desktop browsers' ability to block malware-laden sites, Safari in Mac OS X and Windows blocked only one-in-five malicious sites . Internet Explorer and Firefox, meanwhile, blocked 80% and 27%, respectively. Last month, NSS Labs attributed the disparities between Firefox, Safari and Google - all which use SafeBrowsing as the basis for their blacklists, to differences in how each browser tweaked, then applied, the lists. Google's Chrome blocked a paltry 7% of the sites.
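To make the mechanism concrete: a SafeBrowsing-style client keeps a locally synced blacklist and checks short hash prefixes of each URL before confirming a match against full hashes. The sketch below is purely illustrative - it is not Apple's or Google's implementation, the URLs are made up, and the real protocol's URL canonicalization and server-side full-hash lookup are omitted - but it shows why a stale or partial local list (the problem Sutton describes) silently lets malicious URLs through.

```python
import hashlib

def url_hash(url: str) -> bytes:
    """Full SHA-256 hash of a (pre-canonicalized) URL."""
    return hashlib.sha256(url.encode("utf-8")).digest()

class LocalBlacklist:
    """Simplified local blocklist in the spirit of the SafeBrowsing design."""

    def __init__(self, bad_urls):
        # A real client syncs 4-byte hash prefixes from the provider and
        # confirms prefix hits with a full-hash lookup; here the full
        # hashes stand in for that server-side check.
        self.prefixes = {url_hash(u)[:4] for u in bad_urls}
        self.full_hashes = {url_hash(u) for u in bad_urls}

    def is_blocked(self, url: str) -> bool:
        h = url_hash(url)
        if h[:4] not in self.prefixes:
            return False  # fast path: no prefix match, URL allowed
        return h in self.full_hashes  # confirm against full hashes

bl = LocalBlacklist(["http://phish.example/login"])
print(bl.is_blocked("http://phish.example/login"))  # True: on the list
print(bl.is_blocked("http://example.com/"))         # False: not listed
```

If a device never receives the prefix sync - or, as Sutton suspects of some iPhones, receives a different feed - every check takes the fast "not listed" path and the user sees no warning at all.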

Apple: AT&T didn't ask us to reject Google Voice

In a posting Friday to its Web site, Apple provided several intriguing details about its App Store approval processes, all in response to questions from the U.S. Federal Communications Commission (FCC) about the rejection of Google Voice from the App Store.

The page, titled "Apple Answers the FCC's Questions," discusses the App Store process in general and responds specifically to questions asked of Apple by the FCC. Since the reported rejection of a Google Voice app was the motivation behind the questioning, much of the document focuses on that matter.

According to the posting, the company has - despite media reports - "not rejected the Google Voice application, and continues to study it." (In late July, a Google spokesperson told TechCrunch that "Apple did not approve the Google Voice application" - which technically doesn't say the company rejected it.)

In essence, Apple's posting suggests, the company has decided that Google Voice is a can of worms that has the potential to confuse iPhone users - and it's this concern, and not any discussions with AT&T, that has prompted the company to prevent Google Voice apps from being sold in the App store.

Google Voice "appears to alter the iPhone's distinctive user experience by replacing the iPhone's core mobile telephone functionality... Apple spent a lot of time and effort developing this distinct and innovative way to seamlessly deliver core functionality of the iPhone," the posting says. Citing Google Voice's routing of voicemails around the phone's Visual Voicemail and the transfer of an iPhone's contacts onto Google servers, Apple's statement goes on to say that such factors "present several new issues and questions to us that we are still pondering at this time.... We are continuing to study the Google Voice application and its potential impact on the iPhone user experience."

However, Apple's statement clearly declares that AT&T had nothing to do with the non-approval of Google Voice. "Apple is acting alone and has not consulted with AT&T... no contractual conditions or non-contractual understandings with AT&T have been a factor in Apple's decision-making process in this matter," the statement says.

This isn't to say that AT&T hasn't been a player in previous app rejections. Apple's posting points out that a provision in Apple's contract with AT&T obligates the company to reject apps that would allow Voice-over-IP phone calls over AT&T's network. And the company, while apparently not having a contractual obligation, has chosen to "respect... AT&T's customer Terms of Service, which, for example, prohibit an AT&T customer from using AT&T's cellular service to redirect a TV signal to an iPhone."

That's a reference to SlingPlayer Mobile for iPhone, which was rejected from the App Store until Sling Media altered the product to work only over a Wi-Fi connection, and not on AT&T's network.

Apple's posting also mentions that, on occasion, AT&T expresses "concerns regarding network efficiency and potential network congestion" associated with apps, and that Apple "takes such concerns into consideration."

In addition to the issues around Google Voice and AT&T, the document features several other interesting tidbits regarding the App Store rejection process:

  • Most rejections "are based on bugs found in the applications," and approximately 20 percent of apps must be submitted more than once before they're accepted onto the store.
  • 95 percent of applications are approved within 14 days of their submission.
  • The company receives "about 8,500 new applications and updates" every week, and in total has received 200,000 submissions (both new apps and updates) since the App Store submission process began.
  • There are "more than 40 full-time trained reviewers," and each application is studied by "at least two different reviewers" so that Apple's process can be applied uniformly.
  • There are weekly meetings of an "App Store executive review board" made up of senior App Store managers, which "determines procedures and sets policy" for the review process.

In a tone similar to that taken by Apple senior vice president of worldwide product marketing Phil Schiller in public exchanges with Daring Fireball blogger John Gruber and software developer Steven Frank, Friday's Apple document acknowledges that Apple has made some mistakes while innovating with the App Store.

"We're covering new ground and doing things that had never been done before," the document says. "Many of the issues we face are difficult and new, and while we may make occasional mistakes, we try to learn from them and continually improve."

(Update: Engadget has posted images of AT&T and Google's responses to the FCC. Hat tip: Daring Fireball.)

Symantec: Why bigger is better but less is more

Symantec likes to point out how much bigger it is than competitors like McAfee, which at $1.6 billion in annual revenue is about a quarter of Symantec's size. But it also has come to recognize that bigness has its downsides, such as confusion that can stem from having too many products.

"The big change for us is that if you talked to me 6 or 9 months ago I'd have talked about literally over 100 products we have at Symantec from a security perspective, but going forward we'll just talk about four as we focus our investment," says Francis deSouza, senior vice president of Symantec's Enterprise Security Group. (The group is smaller than Symantec's $2 billion consumer security group but close to $2 billion itself; the rest of Symantec's revenue comes from storage-, archiving- and information-management-related businesses.)

Those four product areas are: protection suites that include endpoint security; data loss prevention; compliance and policies; and systems management (Altiris products).


Symantec's deSouza says the company is simplifying its approach as customers face a more complex mix of threats, including viruses, botnets and insider threats, across a broader surface area that includes mobile devices and cloud environments. To emphasize how scary things are out there, he pointed out that Symantec issued more antivirus signatures last year than in its 17 previous years combined and that organized crime is behind 90% of data breaches now.

"The criminals are brazen," deSouza said. "They're not hiding which countries the threats are coming from yet."

Symantec has even identified a common anatomy of organized attacks, which largely take place via targeted emails/spam, poorly protected Web-facing infrastructure and poorly written Web-facing applications. The attackers break in, perform a discovery of networked assets, put a value on the data available and then take what they want.

"Most companies have no idea they're even under attack," said deSouza, who joined the company in 2006 when it bought IMLogic, a company he founded and led.

One reason for this shortcoming is that companies have various security systems in place that don't necessarily talk to each other well enough to give security teams a big picture view of what's going on. Symantec will be pushing security information management technology to address this, deSouza said. New on this front is the ability to feed into a SIM system from a global intelligence network, he said.

While SIM offerings have been around for years, deSouza said there is evidence that customers are buying into the technology in a big way, noting that Symantec's 2007 Vontu acquisition has exceeded expectations. He also pointed to the financial performance of ArcSight, a publicly traded security management specialist that saw 34% year-over-year growth for its fiscal year ended in April.

DeSouza also singled out DLP as a growth opportunity. Though he once thought DLP might be relegated to a feature of other security products, he said it has emerged as its own entity. One reason for this is that the people who tend to deal with compliance and data leakage issues within companies tend to be separate from those handling antivirus and other more traditional security detail.

One area Symantec does not plan to attack is pure network security, even though rival McAfee does. With Cisco, Juniper and Check Point already controlling the market, deSouza says Symantec's prospects wouldn't be good. "Our corporate strategy is to be #1 in the categories we compete in," he said, noting that Symantec has chosen to limit its participation in this market segment to partnering with HP, Microsoft and others.

Follow Bob Brown on Twitter.

VMware offers tools to automate disaster recovery, application deployment

VMware is releasing two bundles of management and automation products designed for disaster recovery and the delivery of applications to users.

The bundles, announced Monday, include several previously released products built on top of VMware's virtualization software and two that are brand-new. The new products are Site Recovery Manager, designed to simplify disaster recovery on virtual machines; and Stage Manager, designed for deploying and updating applications on virtual machines.

IT departments often struggle to keep hardware and data in sync when a disaster recovery situation forces fail-over from one server to another, says Melinda Wilken, a senior director of marketing at VMware. Changes at primary sites have to be reflected on failover servers, and this can require a lot of manual work, she says.

"There's a lot of moving parts involved and the upshot is most disaster recovery plans and processes really fail to meet the recovery objective," Wilken says.

VMware described three key features of Site Recovery Manager:

*Integrated management of disaster recovery plans, letting IT pros create, update and document recovery plans in the VMware VirtualCenter management interface (http://www.vmware.com/products/vi/vc/).

*Automated tests of disaster recovery plans in an "isolated testing environment."

*Automated failover and recovery in the event of an actual disaster.
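The distinction between the second and third features - a rehearsal that leaves production untouched versus a real cutover - can be sketched in a few lines. This is not VMware's API; it is a hypothetical model (the class, VM names and network names are all invented) of a stored recovery plan that starts VMs on an isolated network in test mode and tears them down afterward, but leaves them running in a genuine failover.

```python
from dataclasses import dataclass, field

@dataclass
class RecoveryPlan:
    """Toy model of a disaster recovery plan, not VMware's SRM API."""
    name: str
    vms: list                        # VMs to bring up at the failover site
    log: list = field(default_factory=list)

    def run(self, test: bool = False) -> list:
        """Execute the plan; test runs use an isolated network and clean up."""
        network = "isolated-test-net" if test else "production-net"
        started = []
        for vm in self.vms:
            self.log.append(f"start {vm} on {network}")
            started.append(vm)
        if test:
            # A rehearsal tears its VMs back down instead of cutting over.
            for vm in reversed(started):
                self.log.append(f"stop {vm}")
        return started

plan = RecoveryPlan("payroll-dr", ["db01", "app01"])
plan.run(test=True)    # rehearsal: start on isolated network, then stop
plan.run(test=False)   # actual failover: VMs stay up at the recovery site
```

The point of the design is that the same plan object drives both paths, so the rehearsal actually exercises the steps a real failover would take.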

Site Recovery Manager is part of the VMware Management and Automation Bundle, which includes the new Stage Manager software and previously released products Lifecycle Manager and Lab Manager.

A second software package called the IT Service Delivery Bundle is identical to the Management and Automation package, except it is cheaper and does not include the Site Recovery Manager.

Stage Manager targets virtual server sprawl, a common problem in which virtual machines proliferate across an enterprise with little IT oversight or control. Administrators don't want to make changes directly in production environments, so they create "shadow instances" for testing new applications or patching and updating existing ones, Wilken says. The shadow application instances end up being out of sync with those in production, she notes.

"As IT managers roll out or update applications, instead of having to keep track of multiple instances of configurations throughout these stages, IT managers using VMware Stage Manager can automate the process so changes and updates are efficiently propagated," VMware states in a press release.

This reduces risk and eliminates errors, the release adds. Essentially, the product lets IT update an application using an exact replica of the one in production, and then transfers the updated software to a production server when it's ready, Wilken notes.

VMware Lifecycle Manager, released on March 31, provides an automated system for requesting, approving, deploying, updating and retiring virtual machines. Lab Manager, which has been available since December 2006, gives users quick access to virtual machines without sacrificing IT control.

The VMware IT Service Delivery Bundle can be ordered from distributors and resellers beginning May 19 for $2,995 for every two processors. The Management and Automation Bundle, available the same day, will cost $3,995 per two processors. All products within the bundles can be purchased separately.
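Since both bundles are priced per two processors, the cost of licensing a host scales with its CPU count. The sketch below works through that arithmetic; the prices come from the article, but the assumption that an odd processor count rounds up to the next two-CPU license is ours, not a stated VMware policy.

```python
import math

# Prices quoted in the article, in USD per two processors.
PRICES = {
    "IT Service Delivery Bundle": 2995,
    "Management and Automation Bundle": 3995,
}

def bundle_cost(bundle: str, processors: int) -> int:
    """Cost of licensing one host (assumes odd CPU counts round up)."""
    licenses = math.ceil(processors / 2)
    return licenses * PRICES[bundle]

print(bundle_cost("Management and Automation Bundle", 8))  # 15980
print(bundle_cost("IT Service Delivery Bundle", 2))        # 2995
```

So an eight-processor host would need four licenses of the pricier bundle, or $15,980, versus $2,995 for a basic two-processor host on the cheaper one.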

SharePoint search bolstered with MetaVis tools

MetaVis Technologies Monday released a set of tools to classify and organize data to help users improve the search capabilities in Microsoft's SharePoint Server. 

The company released MetaVis Architect for SharePoint and Classifier for SharePoint, which are designed for information architects and end users, respectively, to help them categorize and tag data. Users moving data from established ECM systems to SharePoint often find that the Microsoft platform offers only a subset of the tools and capabilities needed to organize data for effective search. MetaVis is attempting to plug those gaps. In addition, data classification and tagging can be set centrally and used across SharePoint sites, rather than per site as native SharePoint tools require.

"People that had invested good money in ECM systems had organized content so it would be searchable," says Steven Pogrebivsky, CEO of MetaVis. "When they got to SharePoint there was a loss on what to do."

MetaVis Architect, which is designed for consultants and information architects, provides a visual representation of a SharePoint environment that lets users edit the diagram and set synchronization among SharePoint sites. Standardized data classifications can be used across sites and collections of sites. Users can design data classifications, called taxonomies, offline and then deploy them to SharePoint. Architect also allows replication or consolidation of SharePoint's columns, content types and libraries, and compares existing taxonomies between sites and site collections to ensure consistency.

The Classifier tool, which is designed for any end user, lets users import and tag data from SharePoint or other repositories; move and copy content between folders, sites or servers; and perform data classifications in bulk. Trial versions of the tools are available here. Architect is priced at $5,500 per seat or $2,750 for an annual subscription.

Classifier is $1,500 for the first seat or $750 for an annual subscription.