Western Digital says the smart money is still on hard drives
http://channel.hexus.net/content/item.php?item=26918
When solid state drives (SSDs) started to get cheaper per GB, many of us wondered what the hell incumbent hard disk drive (HDD) leaders such as Seagate and Western Digital (WD) were doing allowing the likes of Intel, Kingston and Corsair to take their market away from them.
But the wholesale move from HDD to SSD hasn't happened for the simple reason that the cost per GB of HDDs remains a lot lower and there's only so much mainstream consumers are prepared to pay for the performance and power consumption benefits of the SSD. A good case can be made for paying that extra, but it's still extra.
Of course the HDD incumbents have hedged their bets by buying into SSDs, but those remain primarily an enterprise play. All reasonably-priced PCs are still going to run on HDDs, but WD at least still sees a lot of growth potential in the external HDD market. We spoke to WD head of consumer branded products, Jim Welsh, to find out why.
"Attach rates [of external HDDs] are still very low, so there's a huge opportunity," said Welsh. "People should at least be backing-up their stuff." It seems that many end-users still aren't sold on the benefits of shelling out on an external HDD, despite them being cheaper and more capacious than ever.
For this reason Welsh was over in the UK on a channel push. Retail remains an important channel for WD because it's dependent on knowledgeable salespeople to explain the necessity of buying an external HDD to back up and store the exponentially increasing amount of data we're all accumulating. It's also a good margin-maker for retailers who make little on the sales of systems.
"Our strategy is to focus on the end-user and where they buy their products," said Welsh. "We want them to understand how they can unleash the power of storage." WD has direct relationships with its biggest retail partners in the UK - Dixons and Amazon - but also works with a number of disties.
And WD definitely has its eye on emerging sectors. We asked how worried Welsh is about the threat to WD TV - a family of media players designed to work with WD drives - from the innovations coming from Apple and Google. "Consumers have really grasped the concept of time-shift viewing, but having a big name is by no means an assurance you will do well," he said. "WD TV has outsold Apple TV in the US."
Another emerging category that WD is excited about is tablets. Not because it expects many of them to necessarily contain WD drives, but precisely because their internal storage is generally inadequate, and it anticipates demand from tablet owners to dump their media externally on a regular basis.
To conclude we asked Welsh if he's not at all concerned about the threat of SSDs. "We have that technology in-house, but we still haven't found the price-point for mass consumer adoption," he said. "They will continue to coexist in much the same way they do today."
Suppliers urged to embed security technology in consumer services
http://www.computerweekly.com/Articles/2010/10/12/243314/RSA-Europe-Suppliers-urged-to-embed-security-technology-in-consumer.htm
Security suppliers can help improve overall internet security by embedding technologies in consumer-facing services, says RSA president Art Coviello.
"For example, by embedding risk-based authentication technology in online banking services, financial institutions can block risky transactions," he told the RSA Europe 2010 conference in London.
"If an online banker's IP address suddenly appears in Russia, or funds are being transferred to a bank account in Latvia, it is likely to be a fraudulent transaction," he said.
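Coviello's example boils down to scoring each transaction against the customer's normal behaviour and blocking when the combined risk is too high. The sketch below illustrates that idea; the country lists, weights, and threshold are invented for illustration, not RSA's actual model.

```python
# Toy sketch of risk-based transaction authentication in the spirit of
# Coviello's example. All rules and weights here are illustrative
# assumptions, not a real bank's or RSA's scoring model.

USUAL_COUNTRIES = {"GB"}          # where this customer normally logs in
HIGH_RISK_DESTINATIONS = {"LV"}   # e.g. an unexpected Latvian account

def risk_score(login_country, payee_country, amount):
    """Return a score in [0, 1]; higher means riskier."""
    score = 0.0
    if login_country not in USUAL_COUNTRIES:
        score += 0.5              # session from an unusual location
    if payee_country in HIGH_RISK_DESTINATIONS:
        score += 0.3              # funds heading somewhere flagged
    if amount > 5000:
        score += 0.2              # unusually large transfer
    return min(score, 1.0)

def decide(login_country, payee_country, amount, threshold=0.6):
    """Block the transaction when the combined risk crosses the threshold."""
    if risk_score(login_country, payee_country, amount) >= threshold:
        return "block"
    return "allow"
```

A login that suddenly appears in Russia sending funds to Latvia would trip two rules at once and be blocked, while routine domestic activity passes untouched.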
The security industry needs to help consumers by partnering with organisations such as banks, he said, because we know consumers are not qualified to protect themselves and we assume most of their machines are infected by malware.
Continuing his theme of integrated, correlated defence systems for his keynote, Coviello said, "We need to create ecosystems of good guys," to underscore a call for greater interoperability between suppliers.
"There will always be areas we can improve, and no security or infrastructure supplier will ever be able to go it alone and do it all," he said.
Systems can provide the same protection for information that air traffic control systems provide for thousands of flights daily, he said, but only if they are able to take feeds from a wide variety of information consoles from different suppliers.
Asked about the significance of the Stuxnet worm, which targets critical infrastructure control systems, Coviello said it proves the possibility of something security professionals have been worried about for some time.
"For a piece of malware to be able to create mayhem in the physical world is disturbing," he said.
Another disturbing thing about Stuxnet, he said, is that it signals a whole new level of malware sophistication to come.
Coviello reiterated the need for collaboration, and said that was why RSA was forming strategic coalitions with companies such as VMware and Cisco to give enterprises the confidence they need in cloud computing.
"It is natural for security to become part of other systems, and by enabling security in the virtualised environment, we can free up security professionals from operational roles to focus on identifying risks and ways to mitigate them," he said.
The security industry will see more consolidation as enterprises move increasingly to virtualisation and cloud computing, either through acquisitions or coalitions, Coviello said.
EMC was ahead of the curve by acquiring RSA, he said, but others have followed, such as HP's acquisition of ArcSight and Intel's acquisition of McAfee, as organisations have understood that security needs to be part of the stack.
"Coalitions, collaboration and co-operation has to be the order of the day, and RSA will continue to seek partners outside the family," he said.
Making Encryption Standard
http://www.itbusinessedge.com/cm/blogs/vizard/making-encryption-standard/;jsessionid=AE8F59E89924B7995636A0C48E0A99BB?cs=43666&decorator=print
Posted by Michael Vizard Oct 7, 2010 1:55:46 PM
Seagate estimates there are about 50,000 drives a day coming and going from data centers. The cause for concern is all the data that is exiting these data centers and where those drives eventually wind up.
Given the amount of data that can be stored on a single disk drive, that's a legitimate concern. Seagate and other hard-drive vendors have been pushing IT organizations to be more aggressive about adopting drives that come with built-in encryption. With more processing power available than ever, processing encryption is more practical, so it makes sense that this security technology should be more broadly applied.
In addition, recent breakthroughs have shown that the processing of live data can actually take place while data is still encrypted using techniques referred to as homomorphic encryption.
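One way to see what "processing data while it is still encrypted" means is the multiplicative homomorphism of textbook (unpadded) RSA: multiplying two ciphertexts yields a ciphertext of the product. The demo below uses deliberately tiny, insecure parameters purely to illustrate the property; practical homomorphic encryption schemes are far more elaborate.

```python
# Toy demonstration of a homomorphic property: textbook RSA is
# multiplicatively homomorphic, so a server holding only ciphertexts
# can compute an encrypted product. Tiny parameters, insecure by
# design, for illustration only.

p, q = 61, 53
n = p * q                          # modulus, 3233
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (modular inverse)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
# The "server" multiplies ciphertexts without ever seeing a or b:
product_cipher = (encrypt(a) * encrypt(b)) % n
assert decrypt(product_cipher) == (a * b) % n
```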
Seagate earlier this week announced that its Cheetah, Constellation and Savvio drives have received FIPS 140-2 certification from the U.S. government. Theresa Worth, a senior product marketing manager for Seagate, says that as the government becomes more concerned about security, she would not be surprised if the government applied a more stringent encryption mandate for disk drives on every company doing business with the federal government. In addition, many companies are mandating higher levels of security regardless of government requirements, said Worth.
When it comes to using encryption on hard drives, the biggest issues would seem to be replacing the sheer number of hard drives in use and the inertia that goes with that process. But given all the concern about security breaches, you can’t help but wonder if we need to be more aggressive about adopting encryption.
It might not make sense to put encryption everywhere just yet. But it’s more affordable and practical to use in more places than ever. And major advances in managing and processing encryption appear to be coming rapidly.
When the next major breach involving a lost hard drive inevitably happens, who will be held responsible for not taking some basic encryption precautions? When business managers and customers are looking for someone to blame, unwanted attention quickly finds its way to IT operations.
So the next time you’re installing a new set of disk drives, ask yourself what would happen to not only the company, but also the people working in the IT department, should any of those drives go missing.
MaynardG, nice find!
Navellier & Associates is a legitimate and well-known name.
Huh? A Center For Internet Disease Control?
http://www.conceivablytech.com/3333/products/huh-a-center-for-internet-disease-control/
While we are still trying to figure out what Stuxnet, the first truly public example of a new era of superworms, actually is, Microsoft is warning that consumers, governments and enterprises may not be prepared for a wave of cyber attacks.
Microsoft’s Scott Charney, corporate VP of trustworthy computing, came up with the idea because, he argues, the disease prevention infrastructure put in place especially by the Centers for Disease Control (CDC) could work for the Internet and threats originating from cyberspace as well. Much as is the case with diseases in their traditional meaning, Charney describes a society that is aware of risks, a society that is educated in how to avoid infection, and a society that can access advice on what to do in case of infection – from simple actions such as washing hands to systematic approaches for getting rid of a disease.
“To improve the security of the Internet, governments and industry could similarly engage in more methodical and systematic activities to improve and maintain the health of the population of devices in the computing ecosystem by promoting preventative measures, detecting infected devices, notifying affected users, enabling those users to treat devices that are infected with malware, and taking additional action to ensure that infected computers do not put other systems at risk,” Charney writes.
He especially focuses on consumers, as they do not have the layer of protection enterprises typically have in the form of IT departments and “many consumers have no desire to become IT professionals, let alone security experts.” This circumstance will have the effect that “many consumers may be unwittingly running malware and their computers may be part of a botnet.”
“Such botnets may be used to send spam and engage in illegal activities, including launching denial of service attacks against critical infrastructures,” Charney writes. “Some of these activities create enough traffic on the network to make other egregious activity harder to detect and mitigate.”
Education may not be enough anymore in today’s world. And tools that were designed to protect consumers have proven inadequate to battle botnets, as there will always be consumers who deviate from the guidance given (such as downloading files from unknown sources), Charney argues. “We need a better process of ensuring the health of the IT ecosystem. Simply put, we need to improve and maintain the health of consumer devices connected to the Internet. This will benefit not only users, but also the IT ecosystem as a whole.”
According to Charney, the “health of consumer devices” needs to be ensured by governments, the IT industry and Internet access providers before they can get “unfettered” access to the Internet.
The executive believes that such a scenario can be achieved by “bolstering efforts to identify infected devices and promoting efforts to better demonstrate device health.” From the presentation:
Bolstering efforts to identify infected devices involves analyzing and sharing data from sinkholes, network traffic, and product telemetry to identify potentially infected devices. If a device is known to be a danger to the Internet, the user should be notified and the device should be cleaned before it is allowed unfettered access to the Internet, minimizing the risk of the infected device contaminating other devices or otherwise disrupting legitimate Internet activities. In most cases, this can be done with current technology across multiple systems and platforms. In fact, at least one access provider is now attempting this approach. It is our view that approaches like this need to be broadened significantly, even globally.
Promoting efforts to better demonstrate device health can be done by granting access to resources based on the health of a device; this is similar to using Network Access Protection (NAP) in enterprise environments. To achieve this for consumer devices, four developments must occur. First, we need a mechanism for devices to demonstrate their good health (that is, a way to produce a health certificate) without rendering the systems more vulnerable, less reliable, or providing a conduit for leaking private information. Second, the mechanism that produced the health certificate must be trusted (that is, infected devices should not have a way to fake a health certificate). Combining trusted software such as hypervisors and hardware elements such as a Trusted Platform Module (TPM) could further enable consumer devices to create robust health certificates and ensure the integrity of user information. Third, access providers and other organizations must have a way to request health certificates and take appropriate action based upon the information provided. Finally, we will need to create supporting policies and rules to ensure the effectiveness of this model.
Charney proposes that this approach could provide a consumer with a “health certificate” for a device accessing the Internet. If there is a small “problem”, such as a missing patch or out-of-date virus signature, there may be an entity that “assists the user in addressing the security concern or directs the user to resources for remediation.” If there is a more serious problem and a user refuses to get a health certificate, Charney proposes that the user will be motivated to shape up by throttling the bandwidth of the potentially infected device. He does not believe that it is appropriate to deny a user access to the Internet as devices and services converge and a shutoff could have “damaging” consequences: “For instance, an individual might be using his or her Internet device to contact emergency services and, if emergency services were unavailable due to lack of a health inspection or certificate, social acceptance for such a protocol might rightly wane. But much like a cell phone may require a password but still allow emergency calls to be made even without that password, infected computers may still be permitted to engage in certain activities.”
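The scheme Charney describes has two halves: a trusted party vouching for a device's state, and an access provider checking that voucher and throttling rather than blocking when it fails. The sketch below illustrates the flow with an HMAC-signed claim; the claim format, shared key, and policy names are invented for illustration, and a real scheme would rest on TPM-backed attestation rather than a shared secret.

```python
# Hedged sketch of a "device health certificate" check. All names and
# the shared-key signing are illustrative assumptions; real attestation
# would use hardware-backed keys, not a shared secret.
import hashlib
import hmac
import json

HEALTH_AUTHORITY_KEY = b"demo-shared-secret"  # stand-in for real keys

def issue_certificate(device_id, patched, av_current, issued_at):
    """Trusted agent signs a claim about the device's state."""
    claim = json.dumps({"device": device_id, "patched": patched,
                        "av_current": av_current, "issued": issued_at},
                       sort_keys=True)
    sig = hmac.new(HEALTH_AUTHORITY_KEY, claim.encode(),
                   hashlib.sha256).hexdigest()
    return claim, sig

def access_policy(claim, sig, now, max_age=86400):
    """Access provider verifies the certificate and picks a response."""
    expected = hmac.new(HEALTH_AUTHORITY_KEY, claim.encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return "throttle"      # unverifiable: degrade, don't cut off
    data = json.loads(claim)
    if now - data["issued"] > max_age:
        return "throttle"      # stale certificate
    if data["patched"] and data["av_current"]:
        return "full"          # healthy: unfettered access
    return "remediate"         # minor issue: point the user at fixes
```

Note that the failure modes map to Charney's proposal: a device that cannot prove health is throttled, not disconnected, leaving room for things like emergency calls.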
Business partners a growing security concern
Bill Brenner
29.09.2010
Increasingly complex business relationships are forcing companies to give outsiders greater access to internal systems. According to this year's Global Information Security Survey, this presents a security problem
http://news.idg.no/cw/art.cfm?id=5DDF6919-1A64-67EA-E4F2D585EC073698
When it comes to managing risk, companies have plenty of choices. They can outsource security controls or handle it in house. They can put all their data in the cloud or keep it in their data center. But their relationship with business partners is a lot more complicated.
That's one of the takeaways from the Eighth Annual Global Information Security Survey CSO conducted along with sister publication CIO and PricewaterhouseCoopers. Some 12,847 business and technology executives from around the world took the survey, and many admitted they're somewhat more concerned than they were last year that their own security is threatened because the security of business partners and suppliers has been shaken by the recession.
More than three-fourths (77 percent) of respondents agreed that their partners and suppliers had been weakened by the recession, up from 67 percent a year ago.
"Companies are increasingly dependent on third parties whether they like it or not, and those partners need access to your IT infrastructure and your data," said Mark Lobel, a principal in the advisory services division of PricewaterhouseCoopers. "That's tough when times are good and scary when times are bad." Facing their own business problems, third parties need to cut costs just like you do, and they may slash security controls to do it, he says.
Josh Jewett, senior vice president and CIO for Family Dollar, says the company has taken steps to ensure business partners don't compromise its security. "We hold third parties accountable not only contractually, but also operationally," he said. "They must demonstrate they meet the same security standards we have internally."
Family Dollar's partners are also subject to periodic scrutiny by the company or an independent auditor. If their practices jeopardize the company's data or business continuity, it has the contractual right to terminate the relationship.
Similarly, James Pu, information security officer for the Los Angeles County Employees Retirement Association, who is also a certified IT auditor, borrows a tactic President Ronald Reagan used to enforce nuclear arms treaties with the former Soviet Union: Trust but verify.
"Third parties are often required to put their security procedures on paper, but there is never the follow-up to verify. We check up on them," Pu said. "We ask vendors a lot of questions and we limit what they can access. When they come in, we make sure they are escorted." What's more, business partners aren't allowed to connect any computers to Lacera's networks without proper validations and vetting, and they must abide by clear rules governing how data can be used.
If any data or applications are not relevant to a business need, partners don't get access to it. The data or application must be directly tied into whatever initiative -- such as an event -- the two sides are working on together, Pu says.
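Pu's rule is essentially least privilege scoped to an initiative: a partner sees only the datasets tied to the specific joint effort. A minimal sketch, with grant names and the data model invented for illustration:

```python
# Minimal sketch of initiative-scoped partner access, after Pu's rule
# that data must be tied to the joint initiative. The grant table and
# dataset names are hypothetical examples.

# Grants: (partner, initiative) -> datasets that initiative needs.
GRANTS = {
    ("caterer", "summer-event"): {"attendee_counts"},
    ("auditor", "annual-audit"): {"financial_summary", "access_logs"},
}

def can_access(partner, initiative, dataset):
    """Allow only datasets granted for the partner's active initiative."""
    return dataset in GRANTS.get((partner, initiative), set())
```

A default-deny lookup like this means a partner with no grant, or a dataset outside the initiative, is refused without any special-case code.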
Larry Bonfante, CIO of the United States Tennis Association (USTA), feels much the same way about giving business partners access to his systems. Financial applications are locked down. Partners also can't access parts of the network where customer data is housed. Under those conditions, he feels pretty safe about sharing other parts of the network.
"There's always some concern, but we work with our partners to ensure things like encryption and password protection" are used, he says, adding that data flowing between USTA and its partners is encrypted. That way, it's indecipherable and therefore useless to a rogue outsider who tries to access it.
Ken Pfeil, CSO for a large New England-based mutual fund company, said that to ensure secure business partnerships, companies need to get security personnel involved before business leaders choose who will provide third-party services. Security experts will eye potential partners' security controls more carefully than, say, the events and marketing people who identify and pursue these partners would. Security practitioners are also more likely to insist that partners give each other a detailed tour of their security operations.
Pfeil said he is a stickler for cut-and-dried contract terms. "Security must be in the language. How will authentication be handled? How will data be handled in motion and at rest? Which side is responsible for which controls? You must answer all these questions," he says.
p&f, stuxnet is OT.
Hardware Security Provides Peace of Mind for Cloud Computer Users
http://www.technewsdaily.com/hardware-security-provides-peace-of-mind-for-cloud-computer-users-1275/
By Stuart Fox
21 September 2010 10:44 AM ET NEW YORK –
With advanced malware rendering antivirus software essentially useless, and cloud networks like Gmail putting all of your data eggs in one basket, how can you trust remote servers to keep your info safe? According to Stephen Hanna of Juniper Networks, the answer involves switching from security software to security hardware.
Speaking at the New York Institute of Technology cyber security conference last Wednesday, Hanna detailed how only dedicated security chips can provide the security, and security verification, needed to make cloud computing safe and reliable.
“With Gmail, you don’t really know where or how it’s running. You just have to trust that it’s secure. Having hardware security can give you greater confidence in that.”
Unlike security software, which runs on vulnerable multipurpose equipment, hardware security devices are designed for only one purpose. Since these security chips only run a few clearly delineated programs, there’s nowhere for malware to hide, Hanna said.
Not only do these chips protect themselves by shutting down if they detect any activity outside of their original programming, but some, like Trusted Computing Group's TPM, even respond to physical stimuli. If the chip senses any drastic changes in electricity flow, fluctuations in temperature or breaches of its physical casing, it erases all of its sensitive data. General purpose hard drives simply can't match that level of security, Hanna said.
“There are so many ways to get your machine infected, and when you move from software to hardware, you take care of the ability of all those viruses to get your security keys,” Hanna said. “Of course, someone could still steal your computer, take it to a lab, and crack it that way, but that’s a spy scenario that’s not likely.”
Most importantly from a trust standpoint, a user can check whether or not a cloud computer uses these security hardware devices. Each hardware security device comes with a digital certificate that is almost impossible to fake, Hanna said. By checking for that digital certificate, a user can rest assured that their data is safe, even if they can’t physically check the computer it’s on.
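The reason such a certificate is "almost impossible to fake" is that only the holder of the private key, kept inside the chip, can produce a signature that the public key verifies. The toy below shows that asymmetry with textbook RSA and deliberately tiny parameters; real hardware certificates use full-size keys and X.509, so treat this strictly as an illustration of the principle.

```python
# Toy illustration of hardware identity verification: a signature made
# with the chip's private key checks out against the public key, and
# nothing else does. Textbook RSA, tiny insecure parameters.
import hashlib

p, q = 1009, 1013
n = p * q
e = 65537                          # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, held by the chip

def sign(message):
    """Done inside the trusted chip, which never reveals d."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message, signature):
    """Done by the remote user, who knows only (n, e)."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest
```

A cloud user who trusts the public key can thus check a machine's claimed identity without ever physically inspecting it, which is the assurance Hanna describes.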
Thanks to those advantages, security hardware has become a focus of some of the largest computer companies in the world. Last month, chip-giant Intel bought the antivirus company McAfee, a move that signals how interested the market is in this new technology.
This is not to say that hardware security devices are impregnable. In February, computer engineer Christopher Tarnovsky successfully defeated the security on the TPM module. However, Tarnovsky needed far more time and resources to defeat the hardware than he needed to breach even the tightest software security.
This proves that while not perfect, hardware security at least improves upon software enough to inspire confidence.
"FIPS 140-2 Inside"
http://www.techshout.com/hardware/2010/20/seagates-momentus-self-encrypting-drive-acquires-fips-140-2-certification/
“The FIPS 140-2 certification exemplifies Seagate’s commitment to security standards that enable the widespread adoption of encrypting hard drives for laptops and other computers as the explosive growth of laptop PCs puts more sensitive personal and business information at risk. Today’s NIST approval gives our system builder and end-user customers the peace of mind that Momentus Self-Encrypting Drives deliver the full power of government-grade security,” explained Dave Mosley, executive vice president of Sales, Marketing and Product Line Management at Seagate.
This certification paves the way for Momentus Self-Encrypting drives to roll out across U.S. as well as Canadian federal agencies. Besides these organizations, the devices will also be deployed by various local and state governments as well as regulated industries like finance, healthcare and defense that utilize FIPS-certified products to safeguard sensitive data on computer networks and PCs. The drives have even been approved for protecting confidential information in fields such as education, utilities and transportation. FIPS-validated products are also recognized by foreign governments besides Canada.
How I got thrown out of an NSA party
http://www.networkworld.com/news/2010/091610-nsa-party.html
For NSA, the press 'makes them nervous'
By Ellen Messmer, Network World
September 16, 2010 10:36 AM ET
ORLANDO - The National Security Agency, America's high-tech spy agency and guru for military information security, is a secretive sort of creature that doesn't like to come out in the daylight, especially to deal with the media. So inviting the tech media, such as myself, to attend the NSA's first-ever "NSA Trusted Computing Conference and Exposition" was not an easy decision.
After all, they were letting some of their more prominent and smart NSA technical personnel out of the confines of places like Ft. Meade, the NSA headquarters, to talk about how much the agency wants to make use of commercial security products and virtualization -- and influence its development so it's good enough for the Top Secret mission-critical needs of the military.
But while the NSA had apparently decided to include the press at this first-ever conference, it was a decision fraught with much hand-wringing. Which leads me to tell you how I got thrown out of an NSA party — a first for me, I might add.
Let's begin. Just to show how tortured they were about this conference and inviting the press, NSA's public affairs split the conference apart, telling press like me that we could attend one full day of the three-day conference, but the last two days were off limits except for two one-hour sessions and a couple of demos. (Yikes, I've been invited to less than half a conference!)
One vendor trying to help NSA deal with its press-phobia issues was flabbergasted, saying, "It's like they don't really want you to come."
Right, but I did anyway, with misgivings. Even after I read the NSA invitation, which reminded me that photography was prohibited and NSA would not (gasp!) pay for my food. The NSA public affairs lady — an amazingly pleasant person, by the way — even left me a voicemail in my hotel room to remind me that Wednesday and Thursday sessions were basically off limits with just two exceptions.
The anguished tolerance of the press was on display from the start. NSA's Matt Van Kirk (his title: Project Director of Technology Commercialization for Trusted Computing and the High Assurance Platform Program in the National Security Agency/Central Security Service Commercial Solution Center) kicked off the conference by declaring, "The press is here. Be mindful of that."
But it's not like the NSA is no fun at all. Van Kirk encouraged his audience to visit the vendor exhibit area and pick up special "trading cards" related to the Trusted Computing Platform (they gave me eight of them. My favorite is the picture of the "Trusted Boot Code Card" with its enormous boot and chains). He said maybe if you get enough trading cards, you get a gift prize.
So how did I get thrown out of an NSA party Tuesday evening in the courtyard of the Doubletree Hotel?
Well, the nice NSA press affairs lady had unexpectedly given me a ticket to join this NSA soiree. But as soon as I had situated myself in a corner of this outdoor gathering, where it was impossible to hear anything above the steel drums, another conference-management lady told me to leave, saying it had been decided that as press, "you make people nervous."
So I was thrown out. I went back to my hotel room and spent the evening watching one of those documentaries on the evolution of man that shows how close we as homo sapiens are to the Neanderthals. The background music of steel drums hammered on for hours.
Later I spoke with Steve Hanna, distinguished engineer with Juniper, who attended the conference to speak on the topic of Trusted Network Connect (TNC) technology from the Trusted Computing Group. I asked him whether he thought it made sense to shut media out of more than half the conference, and was it so sensitive anyway.
Hanna said as a person involved in helping the standards-development process along, he's in favor of "a maximum amount of openness. We all benefit from fresh air and sunlight."
He said he'd understand it if the press weren't invited to check out the development of the next Stealth fighter, but the Trusted Computing Group technologies and product implementations need the benefit of public participation.
The TCG technologies are mature, and although the NSA has participated in standards development, those at the NSA "are not the only or even the leading participant," Hanna said. Technologies such as TNC and TPM have gained wide following in the commercial sector, Hanna pointed out, but lag in the government and the military certainly in part because of barriers such as the "length of time to get things approved."
That was an impression I got, too, attending just one day of this all-too-secret conference.
So at the end of all this, I had to think back to well over a decade ago to the old NSA-sponsored conferences in Baltimore where the NSA came out of the shadows once a year to bring together industry, government and the private sector on behalf of public awareness about the trusted systems described in what was called the old "Orange Book" and "Red Book." It was always right around Halloween, so we took to fondly calling them the spookfests.
These old Baltimore NSA conferences, long gone, were open to the press with no restrictions. But as a new generation at the NSA takes charge over national security and tries to create a new series of trusted-computing conferences, will they forget that "sunlight is the best disinfectant," as U.S. Supreme Court Justice Louis Brandeis once put it?
NSA has become an enthusiastic proponent!!!!!
of open standards-based technologies such as Trusted Network Connect (TNC) and Trusted Platform Module (TPM)
http://news.idg.no/cw/art.cfm?id=17327EAA-1A64-6A71-CEB1D6D873DA50ED
NSA product accreditations lag behind IT security advances
http://news.idg.no/cw/art.cfm?id=17327EAA-1A64-6A71-CEB1D6D873DA50ED
Ellen Messmer
15.09.2010 kl 20:43 | Network World (US)
ORLANDO -- The National Security Agency wants to use commercially-built security products and the latest virtualization software. But the slow pace of getting products certified through NSA channels and the lightning-fast pace of change in the IT industry is causing national-security heartburn.
The high-tech spy agency, which also guides Defense Department information security, has become an enthusiastic proponent of open standards-based technologies such as Trusted Network Connect (TNC) and Trusted Platform Module (TPM) put forward by the organization Trusted Computing Group (which announced it expects to propose an end-to-end security framework for cloud computing around year-end).
This week the secretive NSA held its first conference related to its views on trusted computing. The NSA Trusted Computing Conference and Exposition in Orlando drew about 500 attendees and 39 exhibiting companies.
Michael Lamont, NSA chief of the network solutions office, noted in his keynote that since May of this year the national-security strategy has been "COTS [commercial off the shelf] first, not GOTS [government]."
Lamont said the NSA wants to influence how commercial technologies are developed, and hopes "richer collaboration could further harden national-security systems" and give commercial systems some "government-like security."
Trusted computing "will be a key enabling technology or set of technologies," said Neal Ziring, technical director, information assurance directorate, NSA, in his conference keynote address.
Ziring said the NSA, under its High Assurance Platform (HAP) program, is turning to a "deliberate reliance on commercial products for protecting even national-security information," and said "my customers are demanding mobility." In the future, NSA expects "COTS will be used to protect even the most sensitive classified information."
Products developed to adhere to the specifications of the Trusted Computing Group (TCG) are a big part of the vision.
Certification processes stall adoption
The NSA's customers are the vast U.S. military and intelligence communities that require accredited software and hardware for sharing information from Top Secret through Secret down to Classified and Unclassified. Products used for "Cross Domain Solutions," for instance, which provide the ability to access or transfer information between two or more security domains, have to be examined and certified before being accepted for use. But the NSA and military-supported certification processes, such as the one called Common Criteria, are slow as molasses compared to the IT industry's lightning-fast innovations.
As if to underscore that point, Ian Pratt, vice president for advanced products at Citrix Systems, gave a keynote packed with heady technical detail on new virtualization software from Citrix, including the Xen-based client hypervisor and multiple ways to run virtual machines while setting policy controls through so-called "service VMs." He explained how TCG-related technologies such as TPM would work, and added that in the future Citrix may come out with a "virtual TPM" that would run as a dedicated virtual machine.
The NSA is hearing demands from the military for high-security options built on virtualization. The first desktop-virtualization HAP workstation, built by General Dynamics, was shown in a video demonstrating how a VMware-based, hardened Red Hat workstation using TNC- and TCG-compliant hardware components such as TPM, as well as Intel's TXT and TVD, can support secure domain separation.
The HAP workstation, called "Trusted Virtual Environment," is said to allow for attestation, to store system measurements reliably and keep encryption keys safe. During remote attestation, network access can be denied to machines whose identity doesn't check out and compromised HAP workstations could be blocked.
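The remote-attestation flow described above can be sketched in miniature: a verifier holds known-good measurements for an approved configuration and denies network access to any machine whose reported measurements don't match. The measurement names and values here are illustrative, not taken from the HAP specification.

```python
import hashlib

# Illustrative known-good measurements (hashes of boot components) a
# verifier would hold for an approved workstation configuration.
KNOWN_GOOD = {
    "bios": hashlib.sha256(b"approved-bios-image").hexdigest(),
    "hypervisor": hashlib.sha256(b"approved-hypervisor-image").hexdigest(),
}

def attest(reported: dict) -> bool:
    """Grant network access only if every reported measurement matches
    the known-good value; any mismatch marks the machine as compromised."""
    return all(reported.get(k) == v for k, v in KNOWN_GOOD.items())

# A machine reporting the expected measurements is admitted...
assert attest(dict(KNOWN_GOOD))
# ...while one whose hypervisor hash differs (e.g. tampered) is blocked.
bad = dict(KNOWN_GOOD, hypervisor=hashlib.sha256(b"rootkit").hexdigest())
assert not attest(bad)
```

In a real TPM-backed deployment the reported values would be signed PCR quotes rather than a bare dictionary, but the admit-or-deny decision has this shape.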
But Bill Ross, director of cybermission assurance systems, C4 systems, at General Dynamics, bluntly told the NSA conference attendees that the current fast-paced and sometimes chaotic state of industry support for TCG-related technologies, along with lengthy accreditation times for HAP, is adding up to real obstacles.
"The rapidly changing hardware environment" has led to "rapid commercial product release and obsolescence," Ross said in his keynote talk about the difficulties of cobbling together various vendor products to build TAP-approved solutions such as the HAP workstation. "We're out of sync with changes in commercial technology."
"The problems are in what I'd call the techno-political realm," he added, noting that there are difficulties in convincing partners, which today include most prominently Intel, VMware, Dell, HP and others, that the effort is warranted.
"We didn't understand what motivated them," Ross pointed out. "We'll say, 'We'll pay you.'" But he admitted he was surprised to see "that rarely worked." Sometimes they'd say they wouldn't support a project because of what they called unclear "opportunity cost." The vendors want to know that their effort for TAP and TCG will lead to wider opportunities beyond just a single TAP project.
The lengthy and cumbersome certification process known as "Secret and Below Interoperability," among others, was an obstacle.
"Bottom line is, it was a lot of growing pains to navigate through the certification process," Ross said, and "it was difficult to keep the interest on multi-year periods."
Separately, Ross said it took 18 months to get the Trusted Virtual Environment workstation, which allows Top Secret and below communications, through the HAP accreditation process, which was completed last year. The Trusted Virtual Environment workstation is being used by the Special Operations Command, across multiple services including the Army as well as NSA. But he said he didn't know the exact numbers, because that's kept secret.
Inside initiatives
NSA, headquartered in Ft. Meade, Md., is not given to much public interaction, particularly with the media, and is clearly struggling with conflicting desires to keep its employees well hidden while also trying to greatly influence development of security technologies in the commercial sector.
NSA allowed systems engineer Boyd Fletcher as well as Fred Leong, NSA Trusted Computing Firmware Project Lead, to discuss some of their initiatives in conference presentations where press was in attendance.
Fletcher described efforts to help develop cross-domain solutions (CDS) in a virtualized environment based on Type 1 hypervisors in particular. Military data centers and in-the-field military are clamoring for virtualization options, and the benefits of virtualization are clear, he said.
The NSA still advocated that CDS run on a trusted operating system, and "maybe in the future will run on a trusted hypervisor," he said. But virtualization promises to help eliminate a lot of the manual labor associated with having administrators physically touching hardware associated with traditional CDS today.
Virtualization's remote console capability could allow for "live migration over thousands of miles, if necessary." But if that transition occurs, system management security will grow in importance, as will technologies such as network-address translation to make sure cloned CDS instances don't all have the same IP address, he pointed out.
But Fletcher acknowledged the accreditation process, which can take up to two years, isn't making change simple for CDS.
In addition, Fletcher is helping craft what are called "Virtualization Security Requirements" for use by developers and others, as well as a "Virtualization Security Controls Profile" aimed at analyzing security capabilities in assorted virtual machines, including hardware, which is expected to be contributed to the fourth revision of the 800-53 security requirements document published by the National Institute of Standards and Technology.
Fletcher also said his group expects to have what's called a "Virtualization Protection Profile" for hypervisor and management that would constitute "security targets" that vendors could strive for as part of Common Criteria and the National Information Assurance Partnership program which administers the Common Criteria evaluations in the U.S.
NSA's security experts also appear ready to intercede when they think there's a problem brewing. Various security researchers have shown how it's possible to compromise computers through potential zero-day attacks on the System Management Mode (SMM), which is present in most x86 processors today, Leong said.
In his presentation, Leong alluded to work by Invisible Things Lab and others, which have made the case that rootkits can be dropped by an attacker via SMM.
Leong said the NSA is preparing a mitigation called the SMI Transfer Monitor (STM) to basically replace the current SMI Handler for SMM.
This would basically "sandbox the SMM code," said Leong, noting Intel is working with NSA on it and "Dell has actually modified its BIOS to support this." Sandia National Labs is assisting in testing of STM, and "there will be some performance overhead for doing this," he said.
Even as NSA strives to influence industry development of virtualization and TCG-related technologies, the agency is grappling with how far it will go in pushing for a HAP mandate oriented toward national-security-related IT purchasing.
In his keynote address, Neil Kittleson, Trusted Computing Portfolio Manager at the NSA's Central Security Service Commercial Solutions Center, said "we need HAP," which has been put forward in various reference implementations. The push for next year is advocacy of some kind of policy directive around HAP and technologies based on specifications from the Trusted Computing Group. He added, "Once we advocate these things, we have to deploy."
What to Expect at IDF 2010
http://www.pcmag.com/article2/0,2817,2368952,00.asp
Excerpt:
• Cloud Computing. If you had to pick the single most popular topic for IDF sessions this year, it would have to be the "cloud"—over a dozen events are slated to discuss it. From specific implementations ("Using the Cloud for Large Data Applications: Real-Time Stem Cell Tracking") to cloud theory ("Accelerating the Transformation to the Cloud," "Network Requirements for High-Performance Computing in the Cloud") to practicums ("Designing Cloud Storage Solutions," "Building Information Infrastructure for the Cloud"), it's clear that noncentralized computing will be an unavoidable topic at IDF. And, chances are, something that will become even more important over the next few years. We're not sure if we'll see anything new next week, but if not, it's only a matter of time.
Thanks Chance! e/
Interesting excerpts from Seagate:
http://www.echannelline.com/usa/story.cfm?item=26134
"The achievement of certification opens the door to a lot of federal business, where they couldn't purchase an encryption solution without this approval," said Joni Clark, Product Marketing Manager at Seagate. "Our channel people on the GSA list are the big winners here.
"The Opal organization set the best practices in IT security," Clark said. "Our DriveTrust technology was the original self encrypting drive out there, and will eventually be phased out for this. ISV vendors who do software encryption are the only one this affects directly, and it's transparent to everyone else. It's much easier if everyone is following one standard, which is why Seagate is also moving to Opal."
And we knew this:
Clark said that software encryption has traditionally had three obstacles: performance, price, and complexity, since integrating it is difficult. "We are hoping to get over these three obstacles with hardware encryption," she said. "They carry a 25-35 percent premium over the price of a normal drive. That's not much; it's very inexpensive."
A flurry of billion-dollar deals stirs up tech industry
Some Silicon Valley giants are snapping up companies to remake themselves around emerging technologies. This year could see the most 10-figure transactions in a decade.
Hewlett-Packard won a bidding war with Dell for 3Par Inc., a data-storage provider in Fremont, Calif. Above, 3Par Chief Executive David Scott. (Robert Galbraith, Reuters / August 26, 2010)
Hewlett-Packard buying security software firm ArcSight for $1.5 billion
By Steve Johnson and Brandon Bailey
September 14, 2010
From Hewlett-Packard and Cisco Systems to Intel and Oracle, some of Silicon Valley's largest companies charged out of the recession with fat bankrolls and a determination to spend whatever it takes this year to reshape their businesses around emerging technologies.
If the buying binge continues, 2010 could rank among the biggest for billion-dollar deals in a decade. Tech companies already have announced six purchases worth at least $1 billion so far this year, according to the 451 Group, which has compiled such data since 2002. The previous record was seven.
Workers and consumers are increasingly using smart phones and other mobile devices instead of desktop computers. At the same time, growing numbers of businesses and others are storing their information in data centers that can be accessed over the Internet, a trend known as "cloud computing."
To cash in on those trends, tech giants are racing to snap up established corporations and catapult themselves into new markets without having to spend years trying to develop the technology themselves. In some cases, the deals have intensified competition in the industry.
"Computing has shifted," said R "Ray" Wang, a veteran tech consultant with the Altimeter Group. Noting that for many computer hardware firms, such as HP and Intel, he said, "the future means they have to do more in software. I think we're going to continue seeing this kind of consolidation."
During the depths of the recession two years ago, many firms' boards balked at buying anything. "If you were a CEO or a CFO and said, 'I've got a great deal lined up,' they said, 'Now's not the time,'" said 451 analyst Brenon Daly.
Instead, companies cut costs and stockpiled cash. They're more free-spending now, experts say, in part because the sagging stock market has made it cheaper to buy public companies and because the interest rates paid on savings are so low.
"Nobody's making any money on cash these days," said Shannon Cross, an investment analyst with Cross Research. "But if you buy something that's making money, that contributes to your business."
HP reported $14.7 billion in cash and short-term investments as of last month, while Intel said it had $12.2 billion as of July.
Another major motivation for buying other businesses is to expand into new markets.
"They want to have a full-service offering to be able to serve the really big customers" who tend to dislike dealing with multiple suppliers, said Richard Hanley, a transactions expert with KPMG.
Moreover, many tech companies remain nervous about how their products will fare in the future. So they are hedging their bets by buying other companies to acquire different products, according to Crawford Del Prete of the research firm IDC. "It's very much a symptom of the uncertain times," Del Prete said.
One example of companies desperate to branch out is Palo Alto-based HP and its rival Dell, which sparred this month over data-storage provider 3Par. HP eventually won the bidding war with a $2.4-billion offer, more than three times the price of 3Par's stock before takeover talks began.
Both computer makers coveted 3Par because they wanted to improve their data-storage product lines, said analyst Unni Narayanan of Primary Global Research. In addition, 3Par's products help data-center operators boost their storage capacity and quickly reassign workloads across multiple devices, as users' needs change — important features for increasingly popular cloud computing. The IDC research firm predicts global sales of cloud-computing products will grow to $55.5 billion in 2014 from $16.5 billion in 2009.
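The IDC projection quoted above implies a compound annual growth rate of roughly 27 percent for cloud-computing products, which is easy to check:

```python
# IDC figures quoted above: $16.5B in 2009 growing to $55.5B in 2014.
start, end, years = 16.5, 55.5, 5

# Compound annual growth rate: (end/start)^(1/years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # prints "Implied CAGR: 27.5%"
```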
Mobile Internet-ready gadgets are another fast-growing market. More than 450 million people worldwide were using such devices in 2009, and that figure is expected to double by 2013, according to IDC. That is a big lure for some Silicon Valley corporations, including Intel of Santa Clara, Calif. The company's microprocessors serve as the brains of the vast majority of personal computers. But it fears being left out of smart phones, which predominantly use chips from other firms.
Last month, Intel announced plans to pay $7.68 billion for software security company McAfee of Santa Clara and $1.4 billion for the wireless chip unit of Infineon Technologies of Germany. By building both companies' technology into its chips, Intel hopes to win more sales from mobile-device makers.
The desire to gain software has prompted other deals.
Cisco of San Jose, which has been expanding beyond its core networking business for several years, said this month that it would buy ExtendMedia, a start-up that makes software for managing and delivering video over the Internet. Mountain View, Calif.-based Google also has scooped up several small social-gaming companies.
And then there is Oracle of Redwood City, Calif. Already a software giant, it paid $7.4 billion for Sun Microsystems, partly to obtain its Java software programming tools. But Oracle also wanted Sun as a way of selling integrated systems that combine Sun's server and data-storage hardware with Oracle's business and database software.
Because of these corporate tectonic shifts, several Silicon Valley companies that once occupied separate business turfs are now vying for the same territory.
In deciding last year to begin selling server computers, Cisco went head to head with HP's server division, for example, while HP's $2.7-billion purchase of 3Com gave it networking products to compete with those of Cisco.
Another example is HP's $1.2-billion purchase of Palm, mainly to obtain that company's phone operating system. HP plans to use it in a variety of smart phones, tablets and other mobile devices, potentially pitting it against Apple's mobile gadgets and Google's Android operating system.
Competing has its risks. But because it's often hard to keep growing fast internally, big companies frequently can't resist the temptation to bolster their bottom lines by gobbling up other businesses, said Kaufman Bros. tech analyst Shaw Wu.
"When a company reaches that stage, there's a need to do something about it," he said, "and that's typically done through an acquisition."
Johnson and Bailey write for the San Jose Mercury News/McClatchy.
Copyright © 2010, Los Angeles Times
http://www.latimes.com/business/la-fi-tech-deals-20100914,0,5547406.story
Seagate Adds Opal, FIPS 140-2 Standards
http://www.crn.com/news/storage/227400315/seagate-adds-opal-fips-140-2-standards-to-self-encrypting-hard-drives.htm;jsessionid=eoyByP1lxmxWdkdPtd+KFA**.ecappj02
Seagate Adds Opal, FIPS 140-2 Standards To Self-Encrypting Hard Drives
By Joseph F. Kovar, CRN 9:51 AM EST Tue. Sep. 14, 2010
Seagate is expanding the capability of its self-encrypting hard drives by standardizing new models on a new industry hard-drive encryption standard and making certain models compliant with the government's FIPS (Federal Information Processing Standard) 140-2.
Self-encrypting hard drives are those with encryption technology built into the drive's controller ASIC (application-specific integrated circuit). They are designed to improve the security of data by capturing and encrypting all the data automatically, with no need to classify the data and no impact on performance. By encrypting the data on a hard drive, the risk that data on lost or stolen PCs can be accessed by unauthorized persons is minimized.
With self-encrypting hard drives, encryption of the data is done at a much higher speed than software-based encryption, with little or no impact on performance.
Seagate started shipping enterprise-class self-encrypting hard drives in Spring of 2008, but had been shipping such drives for mobile PCs since 2006.
Going forward, all new Seagate self-encrypting hard drives will follow a new industry-wide protocol called Opal, which was developed in concert with the Trusted Computing Group, said Joni Clark, senior vice president and marketing manager for Seagate.
Seagate's original self-encrypting drives followed a Seagate proprietary protocol called DriveTrust, and other hard drive vendors who later entered the market also followed their own protocols, Clark said.
Unfortunately, for ISVs looking to take advantage of the encryption technology in the hard drives, it was necessary to include each drive manufacturers' protocols in their software.
"All manufacturers selling self-encrypted drives had their hand in developing Opal," she said. "I hope they all adopt Opal."
Seagate self-encrypting drives featuring the Opal protocol are currently sampling with storage OEMs.
Once Opal is ready, Seagate's drives will include both the Opal and the DriveTrust protocols, Clark said.
TCG launches framework for cloud computing security
http://www.telecompaper.com/news/article.aspx?cid=756591
Published: Tuesday 14 September 2010 | 16:36 CET, Telecompaper
Trusted Computing Group, which develops industry standards for hardware-based security, has launched an effort to extend trust to cloud-based computing, led by the organisation's new Trusted Multi-Tenant Infrastructure work group. The new work group extends hardware-based trust to all aspects of computing and enables secure computing whether local or cloud-based. TCG also has updated its IF-MAP (Metadata Access Protocol), used to enable standardised data sharing among a variety of devices and applications, including for cloud security. Multi-tenant infrastructure refers to unrelated users sharing computing infrastructure, a fundamental characteristic of cloud computing.
The new work group will develop a framework for enabling trust in the cloud. Targeting vendors, providers, consumers and integrators of multi-tenant infrastructure services, the framework will assess the trustworthiness of provider systems and enable real-time assessment of compliance as part of the provisioning process. It will also provide implementation guidance and identify and address gaps in existing standards. The actual framework will consist of policies, best practices, standards and conformance criteria that will be used by product vendors, integrators and IT users to create and evaluate multi-tenant infrastructure. TCG expects to deliver the first parts of the framework in early 2011, and it will be available free of charge on the TCG website. Trusted Multi-Tenant Infrastructure work group participants include AMD, CESG (UK National Technical Authority for Information Assurance), HP, IBM, Infoblox, Juniper Networks, Microsoft, Wave Systems, and others.
IF-MAP is currently being used to support network security applications using equipment from different vendors, and is expected to be used in cloud computing to enable real-time communication among devices including network infrastructure devices and servers. It also has been used to integrate physical security devices, SCADA networks and UC platforms. The updated IF-MAP specification, version 2.0, adds new features to the publish/subscribe client/server protocol, designed to make IF-MAP compatible with existing, vendor-specific approaches. The new specification separates the base protocol from the metadata definitions that standardise how different types of information are represented. The first such metadata specification, released along with version 2.0 of the IF-MAP base protocol, addresses network security, and covers elements such as user identities, devices, network addresses, threats and events. Other industry groups can use the IF-MAP framework to define and standardise metadata for other cases, including factory automation, building automation, cloud computing and smart grid.
How do you protect your virtual machines?
By Paul Mah
http://www.fiercecio.com/techwatch/story/how-do-you-protect-your-virtual-machines/2010-09-10
Security vendor BeyondTrust at VMworld last week performed a demonstration of an attack perpetrated by an insider [1] on the exposition floor. The point was to show how it is possible to penetrate guest virtual machines (VM) and steal the contents of their file systems without leaving a trace.
Of course, BeyondTrust also happens to sell software that reduces the possibility of such meddling, so the demonstration wasn't done for purely altruistic reasons. It did get me thinking, though, about how the use of VMs throws a spanner into the works of traditional defenses against theft and physical intrusions perpetrated against servers.
Some will point out that exposing one's server to a malicious party is unlikely to end well, whether the machine is physical or virtual. My thinking here is that a physical server will at least have access to full disk encryption (FDE) and TPM or other hardware mechanisms to store the decryption key.
Now, it is true that VMs in the typical enterprise setup are probably deployed on SANs, which by themselves are heavily protected by various technologies against both data loss and unauthorized access. Off-site backups of VMs, however, are left in a far more vulnerable situation; the result of information falling into the wrong hands can effectively lead to a compromise of an active VM as the password file of the system is retrieved and cracked.
Perhaps my technical understanding from my system administrator days is a tad out-of-date, and new technologies have emerged to address these issues. If you are an expert in virtualization, I would love to hear from you; do drop me an email or post a comment on the FierceCIO:TechWatch comments section. - Paul Mah [2] (Twitter @paulmah [3])
Tech for Securing a Seat at the Executive Table
Date Published: August 24, 2010
http://www.trustedcomputinggroup.org/media_room/news/157
While Internet Protocol (IP)-based security systems have broken down the silos between surveillance, access control, and intrusion detection, Security (e.g. facility / physical security) largely remains isolated from the rest of the organization and its goals. Hence, the "transformative" benefits derived from being inter-connected with other systems and applications have largely gone unrealized. Some of this stems from a lack of trust in the security of other devices, systems and users that share the network, as well as the politics of working with other groups outside of the physical security department.
However, this situation can soon change by using the interface for metadata access point protocol (IF-MAP) developed by the Trusted Computing Group (TCG). While this group may be new to the physical security industry, many of the 100+ member companies will be familiar, as they include: HP, IBM, Dell, Juniper Networks, Samsung, Microsoft, Intel, Symantec and McAfee, as well as a few physical security companies such as HID and Hirsch Electronics. TCG's goal is to develop, define, and promote open, vendor-neutral, industry standards for trusted computing interfaces across multiple platforms.
The IF-MAP protocol is actually a suite of existing, easy to implement standards that enable the secure (encrypted) exchange of events (metadata) in a pre-defined format between mutually-authenticated systems and devices. This protocol, along with the vendor-neutral system architecture, referred to as the Trusted Network Connect (TNC) architecture, fosters trust between various networked systems and their respective owners. This is achieved by ensuring that each device only reports events to specific "trusted" devices (such as network security or a SCADA system) and that these devices only respond to those events through the policies that their owners, such as the physical security department, deem to be important and relevant. At a very high level of abstraction, one can think of IF-MAP as "Twitter" for networked devices.
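The publish/subscribe pattern behind that "Twitter for networked devices" analogy can be sketched with a toy in-memory broker. This is an illustration of the pattern only, not the actual IF-MAP wire protocol (which is SOAP/XML-based with mutually authenticated, encrypted sessions); the identifier and event names are invented.

```python
from collections import defaultdict

class MapServer:
    """Toy metadata access point: clients publish metadata about an
    identifier; subscribers to that identifier are notified."""
    def __init__(self):
        self.metadata = defaultdict(list)     # identifier -> published events
        self.subscribers = defaultdict(list)  # identifier -> callbacks

    def subscribe(self, identifier, callback):
        self.subscribers[identifier].append(callback)

    def publish(self, identifier, event):
        self.metadata[identifier].append(event)
        for cb in self.subscribers[identifier]:
            cb(identifier, event)

# The physical-security system publishes a badge event; the network
# policy system, subscribed to that user, reacts to it.
server = MapServer()
seen = []
server.subscribe("user:alice", lambda ident, ev: seen.append((ident, ev)))
server.publish("user:alice", {"type": "badge-in", "room": "datacenter"})
print(seen)  # [('user:alice', {'type': 'badge-in', 'room': 'datacenter'})]
```

Each system only ever talks to the broker it trusts, which is the demarcation point the following paragraphs describe.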
By using IF-MAP-enabled products, physical security and facilities management can communicate certain events to other trusted groups (including marketing, manufacturing and operations) who can use this information to implement and enforce new policies that improve compliance and security and in some cases, productivity, when the information provides for greater situational awareness. For example, physical security can provide information on the physical presence of a given individual in a specific room or building, the status of a facility or campus (safe, breached, on fire, etc.) as well as seemingly more esoteric things like the status of lighting, heating, ventilation and air conditioning (HVAC) or an elevator. Conversely, other networked systems and devices can report events and status (such as a user's network activity, location through wireless LAN triangulation, unauthorized data downloads, and process control system activity) that could trigger a new physical security or facilities policy-based response.
While some of these capabilities may have been offered by various companies in the past, most have been achieved through non-standard, proprietary communications. They tend to be limited in scope and thus, not as flexible to new applications. Due to the proprietary nature, they are not interoperable with multiple vendors' products. Finally, if a single product that straddles two groups' responsibilities is used, it can create political problems as to ownership of the device and who dictates a given policy.
The beauty of IF-MAP is that it has support from more than 100 companies, and their products may already be in use by your own or your customer's organization! Furthermore, the fear of compromising the integrity of a given system, and the political issues that commonly surround joint system operation, are eliminated or significantly reduced. There is a clear demarcation between the respective groups' systems. Each group determines what information is shared and what conditions merit a response.
By championing the sharing of formerly disparate systems information, Security can be seen as complementary to the goals of the organization and its peer groups. Facilities and Security personnel can help their peer groups execute on their respective goals and responsibilities (and vice-versa). As a result, the use of IF-MAP-capable devices and systems within an organization enables it to implement new policies that will add value to multiple groups. Hence, the physical security group may finally achieve or enhance its "place at the table", alongside IT and other non-security-oriented functional group peers.
Robert Beliles is a security industry consultant.
Comment: Securing data-at-rest with self-encrypting drives
26 August 2010
Bret Weber, LSI
In order for data centres to guarantee the security of their most valuable asset – data – they must identify critical control points where data is at its most vulnerable. One of these critical control points is when data is at rest, particularly data stored on hardware and storage devices. Bret Weber, chief architect at storage and networking provider LSI, looks at how data centres can utilise self-encrypting hard drives to protect data-at-rest.
Securing information in the data centre is critically important. A company’s data is one of its most valuable assets, and a security plan must provide for protection of data throughout all aspects of the storage ecosystem.
Each point in the storage infrastructure provides unique threat models that must be dealt with using best-in-class methodologies. Some examples of these ‘security domains’ would be: data-in-flight, data-at-rest, authentication of devices and users, key management, and end-to-end-data integrity.
It’s critical to protect data-at-rest – data stored on a hard drive or other storage device. Eventually every hard drive in a data centre leaves the premises. It may be stolen or lost; it may be sent back to the vendor for servicing; it may be repurposed.
Most hard drives that leave the data centre are operable and readable. In fact, studies have shown that 90% of failed drives actually have some amount of readable data.
Some data centres hire professional services to dispose of decommissioned hard drives. The drives, however, are still vulnerable. If only one drive is stolen or lost, a company may be forced to pay millions of dollars in remedies for the compromised data.
Many nations have laws requiring a company to publicly disclose the loss or theft of hard drives that contain customer information. Such disclosures can be costly in terms of money, negative publicity and lost customer confidence.
Advantages of self-encrypting drives
For the data-at-rest security domain, you must consider the specific threat models that will be encountered, and then choose the best methodology to protect against those threats.
We believe the best solution for protecting data-at-rest is to use standardised self-encrypting hard drives that automatically encrypt everything written to them. This is better than using a traditional hard drive and encrypting data upstream from the drive. With upstream encryption, when the drive leaves the environment, an attacker can read the ciphertext at will and use it as a hint to crack the data encryption keys.
Self-encrypting drives prevent this method of attack, by not allowing any access to data until the drive is authenticated. Ciphertext is never exposed in a self-encrypting drive, and the only way to get at it would be through destructive methods, such as a spin stand.
Another advantage of using self-encrypting drives is that there is no performance impact.
When encrypting data-at-rest, one of the biggest issues is data classification: determining what needs to be encrypted and what doesn’t. This is especially true when the encryption methodology has performance impacts.
With the huge amounts of data that we are talking about, it is a mind-boggling task to sort through the terabytes of information. Additionally, how can you be sure that you found everything and got it encrypted? With self-encrypting drives, the drive automatically encrypts all data written to it, so that no one needs to spend valuable time deciding which information to encrypt.
Once authenticated, self-encrypting drives appear exactly the same as non-encrypting drives to the storage infrastructure. No changes are required to the applications. This is in contrast with encrypting data upstream, which can impact storage system value and operations downstream, such as data de-duplication or compression.
The issue is that encrypted data cannot be de-duplicated, because the storage confidentiality cipher encrypts data based on its location. In other words, two identical pieces of data stored at two different logical block addresses do not encrypt to the same ciphertext, so they will not be de-duplicated. In the case of compression, the encryption process randomizes the data and degrades the compression ratios.
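The location-dependence described above can be sketched in a few lines. This is a conceptual stand-in only – real drives use location-tweaked modes such as AES-XTS, not an HMAC keystream, and the function name and key values here are hypothetical – but it shows why identical data at two LBAs produces different ciphertext and so defeats block-level de-duplication.

```python
import hashlib
import hmac

def encrypt_sector(key: bytes, lba: int, plaintext: bytes) -> bytes:
    """Encrypt one sector with a keystream derived from the key AND the
    sector's logical block address (LBA), mimicking location-tweaked
    drive encryption. NOT a real cipher -- a stand-in for modes such as
    AES-XTS used by actual self-encrypting drives."""
    keystream = b""
    counter = 0
    while len(keystream) < len(plaintext):
        # Keystream block depends on the key, the LBA, and a counter.
        block = hmac.new(key,
                         lba.to_bytes(8, "big") + counter.to_bytes(4, "big"),
                         hashlib.sha256).digest()
        keystream += block
        counter += 1
    return bytes(p ^ k for p, k in zip(plaintext, keystream))

key = b"\x01" * 32
data = b"identical sector contents"          # same plaintext twice
c0 = encrypt_sector(key, 0, data)            # written at LBA 0
c1 = encrypt_sector(key, 1, data)            # written at LBA 1
print(c0 != c1)  # True: different LBAs -> different ciphertext,
                 # so the de-duplication engine cannot match them
```

Because the cipher is a simple XOR keystream, applying `encrypt_sector` again with the same key and LBA recovers the plaintext, which is how the sketch models the drive decrypting on read.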
Interoperability is also a major consideration. The encryption cipher is now tied to a disk drive rather than the application, OS or storage controller. Drives with different encryption algorithms can easily be added to an existing storage array, because the encryption algorithm is transparent to the system. Drives with newer encryption technology can be combined seamlessly with older self-encrypting drives in storage systems that support encryption.
Data encryption keys on self-encrypting drives are secure because each drive holds only an encrypted version of the encryption key, and not the key itself. Hard drive manufacturers assume that an attacker could have complete knowledge of the drive’s design and construction, and the location of any sensitive data. Therefore, no clear text data are stored anywhere on the drive, and potential hackers who know the drive’s design cannot use this information to ‘crack’ the encrypted data on the drive.
In fact, with self-encrypting drives, there is no reason to ever escrow the data encryption key. This also guarantees that the data encryption key has never been compromised because it has never left the drive. Since the key has never left the drive, the drive can be securely erased by simply deleting its encrypted form of the data encryption key.
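The key-wrapping flow behind that secure-erase property can be illustrated with a minimal sketch. The credential, salt, and XOR "wrap" here are placeholders – real drives use an authenticated key-wrap algorithm and a vendor-defined credential flow – but the structure is the same: only a wrapped copy of the data encryption key (DEK) is stored, and deleting it crypto-erases the drive.

```python
import os
import hashlib

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# The drive generates a random data encryption key (DEK) internally;
# the plaintext DEK never leaves the drive.
dek = os.urandom(32)

# Only a wrapped (encrypted) copy is stored, protected by a key
# encryption key (KEK) derived from the owner's authentication
# credential. (XOR with a derived pad is a stand-in for a real
# authenticated key-wrap algorithm.)
kek = hashlib.pbkdf2_hmac("sha256", b"owner-password", b"drive-salt", 100_000)
wrapped_dek = xor_bytes(dek, kek)

# Unlock: re-derive the KEK from the credential and unwrap the DEK.
unwrapped = xor_bytes(
    wrapped_dek,
    hashlib.pbkdf2_hmac("sha256", b"owner-password", b"drive-salt", 100_000))
assert unwrapped == dek

# Crypto-erase: destroy the wrapped key. Since the plaintext DEK was
# never escrowed anywhere, the data is now permanently unreadable.
wrapped_dek = None
```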
This is in sharp contrast with methodologies that encrypt data upstream, where data encryption keys must be escrowed. This brings into question assurances about whether the key has ever been compromised. If it has been compromised, and someone has the key from the escrow, the data on a traditional drive could be recovered.
Self-encrypting drive operation basics
The usage model for a self-encrypting drive is pretty straightforward. An authentication key from an outside source is required to unlock the drive for read/write operations. This authentication key will typically come from either an enterprise key management server, or a local key management system, by way of the storage controller. After authentication is completed during power-up, encryption is transparent to the storage system, which can then perform its usual functions in a normal fashion.
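That power-up-locked, authenticate-then-transparent usage model can be modelled as a small state machine. The class and method names below are hypothetical, not the TCG command set; the point is only the flow: no reads or writes are possible until the authentication key from the key manager is presented.

```python
class SelfEncryptingDrive:
    """Toy model of the self-encrypting-drive usage flow: locked at
    power-up, unlocked by an authentication key delivered via the
    storage controller, then transparent to the storage stack."""

    def __init__(self, auth_key: bytes):
        self._auth_key = auth_key       # provisioned credential
        self._unlocked = False
        self._blocks: dict[int, bytes] = {}

    def power_up(self) -> None:
        self._unlocked = False          # always comes up locked

    def authenticate(self, key: bytes) -> bool:
        self._unlocked = (key == self._auth_key)
        return self._unlocked

    def write(self, lba: int, data: bytes) -> None:
        if not self._unlocked:
            raise PermissionError("drive locked: authenticate first")
        self._blocks[lba] = data        # a real drive encrypts here

    def read(self, lba: int) -> bytes:
        if not self._unlocked:
            raise PermissionError("drive locked: authenticate first")
        return self._blocks[lba]        # a real drive decrypts here

drive = SelfEncryptingDrive(auth_key=b"from-key-manager")
drive.power_up()
try:
    drive.read(0)                       # refused before authentication
except PermissionError:
    pass
drive.authenticate(b"from-key-manager")
drive.write(0, b"payroll")
print(drive.read(0))                    # transparent after unlock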
Self-encrypting drives are a standards-based solution, and all drive vendors are participating in the Trusted Computing Group (TCG) standard for secure drive commands, which assures interoperability. We fully expect that all drives will eventually be self-encrypting.
Secure data storage is a real-world problem for enterprises. Encryption on the hard drive, combined with robust key management and a state-of-the-art storage system to house the drives, provides superior performance, manageability and security. This is a significant leap forward to improve security and management in the world’s data centres.
Storage and Security: Ever the Twain Shall Meet?
http://www.wwpi.com/index.php?option=com_content&view=article&id=9061:storage-and-security-ever-the-twain-shall-meet&catid=99:cover-story&Itemid=2701018
Since the early days of storage-based networking, experts have sounded warnings about the need for greater cooperation between the storage and security groups within an organization. Historically there have been virtual (mostly organizational or political) walls between the two groups, with security focused primarily on issues like network threat detection and prevention and end-point protection (e.g. access control, anti-virus and anti-malware software on systems), while storage happily existed independently in its own SAN island. While some voices sounded alarms about the potential threats, such as a bad actor wreaking havoc via SAN intrusion and tampering with centralized storage arrays, few took any special action beyond perhaps tightening password security on SAN and storage infrastructure.
Then, several years ago, data breaches became a hot news item, shining the light of public scrutiny on the ramifications of stolen notebook computers and lost backup tapes. Suddenly, it seemed there were almost weekly reports in the technology trade press of new incidents of data risk through lost media or accidental exposure. Encryption became the watchword of the day, and it was widely anticipated that data-at-rest encryption would become a standard practice as enterprises sought to protect their data. At the very least, mobile media such as tape would be encrypted before being sent offsite.
While offsite tape encryption was adopted by some and came to be considered a best practice, it is far from a standard practice. Instead, the growing adoption of disk-based backup, made affordable by deduplication, caused many organizations to choose to minimize their exposure by reducing or eliminating removable media, thus side-stepping – to a degree – the media-encryption issue. Still, others have simply chosen to continue to live with the risk.
Security Exposure
However, the fact is that while mobile media – whether tape cartridges being shipped offsite or disk drives inside a laptop – represent a significant security exposure, data residing on drives within the data center can also be at risk to a number of threats both intentional and accidental. While there are data access controls regularly assigned and managed at the host operating system level, many organizations lack a coordinated strategy when it comes to storage security. An effective security strategy should address protecting data at multiple levels – host, network, and disk – and for multiple usage scenarios.
Since host protection is relatively well understood within most organizations, I won’t dwell on it here other than to mention the need to control host access to SAN and storage devices. These devices can potentially be managed both in-band (e.g., fibre channel) and out-of-band (e.g., LAN), and it is important to ensure that management access via either method is restricted to designated management devices and authorized users.
While LANs and WANs are typically subject to intense scrutiny by both network and security teams, the fibre channel SAN often receives much less attention when it comes to security. This is largely due to two factors. First, unlike the ubiquitous TCP/IP protocol of the LAN environment, enterprise SANs predominantly run over the much less familiar Fibre Channel protocol, offering a false level of comfort or what some have termed “security by obscurity”. While security experts have spent decades developing skills and best practices regarding LAN and WAN security, relatively speaking, the SAN is still the “new kid on the block”.
The second factor is organizational: SANs are typically managed by the storage team rather than the networking team. This has several ramifications, including the fact that storage administrators tend to be more focused on issues relating to performance and availability. These are areas that are of primary concern to their users, and as a result, security becomes at best a lower priority item, or at worst an impediment to addressing these higher order needs.
Addressing Storage Security
This mindset naturally extends beyond the SAN to the management of storage arrays themselves. This becomes evident in the establishment of storage service tiers where the dominant concerns again are performance and availability with minimal attention to potential differentiation of security requirements across tiers. In actuality, there can be significantly different requirements between the storage security needs of, say, a production environment versus a development sandbox. Yet in many cases, they may end up sharing common ports on a storage array. This is not necessarily by design or carelessness, but simply by lack of appropriate policy and process resulting from the fact that security considerations were not formulated into the requirements.
The first step to addressing storage security is simply to start making it a priority. There are guidelines available, including simply extending to the storage realm security best practices from organizations like the National Institute of Standards and Technology (NIST) that may already be in place within the server and network environments. For a more storage-focused approach to security, the Storage Networking Industry Association (SNIA) has published a Technical Proposal called Storage Security Best Current Practices that has been endorsed by the not-for-profit Trusted Computing Group (available at http://www.trustedcomputinggroup.org/resources/storage_security_best_current_practices). This paper provides a comprehensive set of guidelines for general security management of storage, as well as specific technology areas, including NAS, block-based IP storage and fibre channel, and can serve as an excellent basis for an assessment of current security capabilities.
Another important step is to begin applying network security monitoring and management practices to the storage network. Some network monitoring and management suites offer SAN support, and logging and event-correlation tools, which play an important role in threat identification and analysis, can certainly be applied to storage networking devices. This will likely become an even greater imperative as networking transports and protocols converge with the growing adoption of 10 Gb Ethernet for both LAN and SAN traffic in conjunction with iSCSI and FCoE.
A third area of focus is to establish a formal security plan for data at rest. Beyond tapes and laptops, another well-publicized area of data leakage has been through discarded disk drives showing up on the secondary market (think eBay). Far more likely is the lurking risk of exposure and misuse of sensitive information internally, either maliciously or accidentally. Addressing these issues requires both policy and process, but can be aided by technology. Encryption capabilities that can assist in implementing data-at-rest security policies are broadly available at the host, network, and storage levels. Disk drive manufacturers, including enterprise-class drive producers like Seagate and Hitachi, now offer self-encrypting drives and advanced features like the ability to quickly cryptographically erase disks and automatically lock them on removal from a system.
Clearly, there are differing orders of needs for storage security depending on organization type, and not all features and capabilities are appropriate for all situations. Industries like defense and finance have traditionally been highly security conscious and have evolved mature practices. However, in other types of organizations, there is a danger of complacency – because storage security concerns are not so obvious, they have received little attention. The reality is that the world is changing, and data breaches – even relatively low order ones – can have a much more significant business impact than was previously thought possible. The impact of rapidly growing technologies like virtualization and cloud simply expands the potential risk profile, and the implication for data storage security in such scenarios needs to become a higher priority.
In the past, addressing “one-off” security concerns, such as off-site tapes, in a check-list manner may have been sufficient. Today, it’s not just about avoiding an embarrassing newspaper article. It’s really about ensuring the ability to continue to operate and function as an organization. This requires a strategic approach.
Trusted Computing Group and Members to Demonstrate Trusted Computing in Action at Security Forum
Date Published: August 23, 2010
Trusted Computing Group and Members to Demonstrate Trusted Computing in Action at Security Forum; Sessions Include Discussion of Hardware- versus Software-Based Encryption at Leading Boston Healthcare Provider
Portland, Ore., Aug. 23, 2010 -- Trusted Computing Group (TCG) and member companies Juniper Networks and Wave Systems will demonstrate the trusted enterprise at Forrester's Security Forum 2010, Sept. 16-17, 2010 at the Westin Copley Place, Boston, Mass.
Boston Medical Center's Network Security Engineer Mark Mulvaney will speak at the event's Guest Executive Forum on "Encryption: Hardware or Software? Lessons from the Field" on Sept. 16, 3:00 - 3:30 p.m., based on the center's experiences with data protection.
Attendees can see Trusted Computing in action with today's widely available products in TCG's Booth #208.
Wave will demonstrate its EMBASSY® Client/Server software that provides policy-based access controls, centralized administration and proof of compliance when used with self-encrypting drives based on Trusted Computing Group's OPAL specification. The demo will include initialization, management and user authentication of a self-encrypting drive and password recovery.
Juniper Networks will demonstrate a day in the life of an enterprise enabled by TCG's Trusted Network Connect architecture, as implemented in the Juniper Networks Unified Access Control (UAC) solution for adaptive, identity-enabled network and application access control. Key elements include linking data leak prevention with network access based on the TCG IF-MAP specification; authentication and compliance to corporate policies; differentiated employee, guest worker and contractor access; and maintenance of employees' systems to ensure compliance with security policies.
About the Trusted Computing Group
The Trusted Computing Group (TCG) provides open standards that enable a safer computing environment across platforms and geographies. Benefits of Trusted Computing include protection of business-critical data and systems, secure authentication and strong protection of user identities, and the establishment of strong machine identity and network integrity. Organizations using built-in, widely available trusted hardware and applications reduce their total cost of ownership. TCG technologies also provide regulatory compliance that is based upon trustworthy hardware. More information and the organization's specifications and work groups are available at the Trusted Computing Group's website, www.trustedcomputinggroup.org.
Follow TCG on Twitter, on LinkedIn and on Facebook.
-- 30 --
New! Data Breaches: A Growing Problem for the Healthcare Community
http://www.wavesys.com/collateral/03-000272_HIPAA.pdf
Intel's Brilliant–or Bonehead–Deal
http://online.barrons.com/article/SB50001424052970204451004575433534095540758.html?mod=BOL_hpp_dc
Excerpts:
At the CTIA Wireless trade show in Las Vegas in March, Cisco Systems' (CSCO) chief technology officer, Padmasree Warrior, asserted that there will be a trillion–a trillion!–Internet-connected devices in 2013, up from 500 million in 2007. If that's anywhere close to accurate, that would mean one trillion opportunities for bad guys to do their thing, stealing credit-card information, shutting power grids, screwing up your PC and generally wrecking the world as we know it.
For now, McAfee will continue to make the kind of anti-virus software that slows the performance of laptops like the one that I'm writing this column on (I know, I know; I'm safer this way). Indeed, Intel says, McAfee will continue to operate as an independent, wholly owned subsidiary. (And why not? Both companies are based 0.7 of a mile apart in Santa Clara, Calif.). But the real magic from this deal will come when Intel figures out how to move McAfee's virus-stopping, intruder-blocking, malware-stomping expertise from software into the processor itself. And that could take a year, or maybe two.
Strange, I think most on this board think $25 sounds better than $9...
It's unnecessary dilution... eom
My idea?
Let's see how far Wave gets on its own without a premature buyout at a ridiculously low price.
Intel-McAfee deal is about more than just software
http://www.techradar.com/news/internet/intel-mcafee-deal-is-about-more-than-just-software-711442
Although it might sound like a win-win situation for all, it isn't. The Intel-endorsed "Trusted Computing" initiative didn't go down well with the Free Software Foundation, which argues it can be used to lock out devices. Add the McAfee acquisition (a la endpoint encryption) to Intel's Trusted Execution Technology and you're just inching closer towards vendor lock-in, a nightmarish situation for free and open source applications.
Hardware Security Chip Trusted Platform Module Explained
(Fullmoon Note: I can't believe that after so many years of TPM-awareness elementary articles like this are still showing up.)
http://www.mobile-computing-news.co.uk/tag/tpm-1-2
By Wilson • Aug 20th, 2010 • Category: Laptops, Mobile Computer News
Intel’s acquisition of McAfee shows that security on computers is still a major concern. As such, people will turn to software solutions of all kinds to protect their data. For even more protection, though, there is also a long-standing hardware solution to security concerns called the Trusted Platform Module (TPM for short), which is available on many laptops.
What is a Trusted Platform Module?
In short, a TPM is a special chip that one can install on a computer’s motherboard so as to authenticate hardware. Stated differently, what TPM does is authenticate the computer being used to access the data thereon as opposed to authenticating the user.
The nature of this security chip ensures that information like keys, passwords and digital certificates stored within is made more secure from external software attacks and physical theft.
This effectively protects your computer from external hackers who may be using remote access to get to your data, and also protects your data in the event of physical theft. So all your passwords, keys and certificates – basically all cryptographic functions – are handled at the security-chip level rather than the software level, so that harvesting of your data is prevented.
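The core TPM idea of a register that can only be *extended*, never set directly, can be sketched as follows. This is a conceptual illustration, not the actual TPM command set: the function names, the boot components, and the seal/unseal check are all hypothetical, but they show how a hash chain of boot measurements can gate release of a secret such as a disk key.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style PCR extend: the register is updated only by hashing
    its old value with the new measurement, never written directly."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measure_boot(components: list[bytes]) -> bytes:
    pcr = b"\x00" * 32                  # reset value at power-on
    for c in components:
        pcr = extend(pcr, c)            # order and content both matter
    return pcr

trusted_chain = [b"bootloader v1", b"kernel v5"]
expected_pcr = measure_boot(trusted_chain)

def unseal(secret: bytes, current_pcr: bytes) -> bytes:
    """Release the secret only if the platform measurements match the
    value recorded when the secret was sealed."""
    if current_pcr != expected_pcr:
        raise PermissionError("platform state changed: secret withheld")
    return secret

print(unseal(b"disk-key", measure_boot(trusted_chain)))   # released
tampered = measure_boot([b"bootloader v1", b"kernel-with-rootkit"])
# unseal(b"disk-key", tampered) would raise PermissionError
```

This is why the chip authenticates the *machine* rather than the user: any change in the measured boot chain produces a different register value and the secret stays locked.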
Software functionality, too
Trusted Platform Module has software functions built atop the chip, which further enhances security on web browsers, mail applications and other online-data dependent applications.
Confusingly to those who do not understand the concept, the name is also used to describe the software functions that depend on the security chip. So if you hear of TPM in a software sense, this is what is meant by it.
Computers with TPM
TPM 1.2, the latest version of the Trusted Platform Module, ships natively on a number of laptops. Laptops with TPM 1.2 are available from a number of manufacturers, but HP has invested by far the most effort in this area.
Intel has put together an interesting white paper on TPM 1.2 and Trusted Platform Module in general if the concept interests you. It’s very technical and in-depth yet intriguing if you work in the field.
Intel's McAfee buy is a Buffett-like play
http://money.cnn.com/2010/08/19/technology/intel_mcafee_deal/
NEW YORK (CNNMoney.com) -- Intel's $7.7 billion purchase of security company McAfee makes plenty of financial sense, but it's a head-scratcher from a technology standpoint.
The deal seemed to come out of the blue: Intel is the world's largest chipmaker, so a security software company wouldn't seem to be a good fit for the hardware-focused vendor. Intel has previously stated that security is one of its top priorities as it tries to get its processors into every type of connected device. But existing relationships with security companies, including McAfee, appeared to many analysts to be sufficient for Intel to execute on its technology goals.
A call between Intel executives and investors Thursday morning did little to answer that question. Intel's leaders went buzzword crazy in describing the deal, saying over and over that the McAfee purchase offered "deeper collaboration and integration between hardware and software," "substantial differentiation for our products and platforms," and "enhanced security products."
But Intel CEO Paul Otellini offered up one buzz phrase that actually means something to the company right now: "Value for Intel shareholders."
Intel has $17.8 billion cash on hand, which is just sitting there, earning very little for the company's shareholders.
So what to do with that cash? Intel could increase its dividend, but there's only so much it wants to give away. It could invest in the businesses it already owns, but the company was recently downgraded by analysts because of slowing demand for personal computers -- a problem that cuts right to the core of Intel's business model.
Alternatively, Intel could buy a company that it thinks will generate income for its investors. As Berkshire Hathaway CEO Warren Buffett said about his surprise move to acquire the Burlington Northern railroad company last year, it was an opportunity to deploy "cash in a business we understood and liked for the long term."
The question is: Did Intel make the right deal?
Many analysts expressed concern that Intel paid a 60% premium over McAfee's closing share price on Wednesday, which is expensive by most standards. But others noted that the return on Intel's investment will likely far exceed what it paid for the security company.
"Everyone's focusing on how expensive it was, but with this deal, Intel's cash flow is growing, its balance sheet stays clean, and the capital is at a very low cost," said Ken Hackel, president of CreditTrends.com and author of Security Valuation and Risk Analysis. "This gives Intel a positive spread over what it'd earn ordinarily by investing in the slowing PC market or holding onto its cash."
Hackel estimated that Intel's total cost of capital on the McAfee purchase would be about 4%, but the cash return on its invested capital would probably be around 8%.
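Hackel's 4%-versus-8% estimate implies a concrete annual spread on the purchase price. A quick back-of-the-envelope calculation (using the article's reported $7.7 billion price and treating the percentages as simple annual rates, which is an assumption for illustration):

```python
purchase_price = 7.7e9      # Intel's reported price for McAfee
cost_of_capital = 0.04      # Hackel's estimated total cost of capital
cash_return = 0.08          # his estimated cash return on invested capital

annual_cost = purchase_price * cost_of_capital      # ~$308M per year
annual_return = purchase_price * cash_return        # ~$616M per year
spread = annual_return - annual_cost                # ~$308M per year
print(f"cost ${annual_cost/1e6:.0f}M, return ${annual_return/1e6:.0f}M, "
      f"spread ${spread/1e6:.0f}M per year")
```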
Software in general is a much higher-margin business than hardware, and McAfee is no exception, with a gross margin near 75%. Intel's is around 55%.
So if it's going to go after a security company, Intel likely picked up the best one it could get. Symantec has a larger share of the market than McAfee, but Symantec (SYMC, Fortune 500) is a significantly larger company and would be harder for Intel to integrate. Trend Micro was another option, but its market share is slipping. Other players' offerings are mostly tailored for large corporate customers.
"McAfee was the right one to buy," said James Ragan, an analyst at Crowell, Weedon & Co. "It competes very well with Symantec, and it gives them a big company that's not too big and has a strong mix of corporate and consumer offerings."
But some analysts who praised the move from a financial standpoint criticized it from a 30,000-foot view.
Intel will have to figure out how to integrate a big software company into its hardware business -- and it will have to address the concerns of its existing clients who use products made by McAfee competitors.
"I would highly suspect that the cost of capital is one of the reasons for this acquisition," said Erik Suppiger, a senior research analyst at Signal Hill Capital. "But it's not a good business decision if it's not going to execute well."
Shares of McAfee (MFE) soared 57% on Thursday, while Intel's (INTC, Fortune 500) fell more than 3%.
Intel-McAfee Deal Makes Secure Hardware a Priority
http://www.pcmag.com/article2/0,2817,2368031,00.asp
By: Mark Hachman
08.19.2010
Intel's decision to buy security software giant McAfee for $7.68 billion on Thursday will add a "third pillar" of security to Intel's product portfolio, generating the possibility of short-term product combinations and a longer, more fully-formed integration of security, analysts said.
For its part, Intel remained vague on what products the combination would produce. On a conference call, executives said that the "first fruits" of the partnership would be released in early 2011. (PCmag.com's Neil Rubenking thinks that secure hardware will be the result.)
The deal underscores a quiet shift at Intel, from what has been a traditional focus on semiconductors to what chief executive Paul Otellini referred to Thursday as a "computing" company. Although Intel's software and services division is dwarfed by the revenues from its CPU business, Intel has beefed up its solutions business, purchasing embedded OS developer Wind River in 2009, its largest purchase in software to date. Intel has also quietly purchased RAD GameTools, Havok, CILK Arts, Virtutech, and other software companies over the years.
(Intel's next-largest acquisition, a $2.2 billion deal of networking silicon vendor Level One, was completed in 1999.)
Intel has also made software a vital piece of its efforts to win in the tablet and handheld devices space, through the development of the MeeGo operating system with Nokia and the Intel AppUP application store, noted Greg Richardson, an analyst with Technology Business Research.
And Intel has already built in hardware-based support for virtualization, a technology that isolates operating systems and processes from the rest of the server, preventing any flaw or malware from infecting the rest of the system. It's possible, some said, that Intel might look to its Xeon processor for a short-term enterprise collaboration while it develops its integration strategy further.
The deal still requires the approval of the government. When the deal is closed, McAfee will become a wholly owned subsidiary of Intel, reporting to Intel's software and services group. It will be managed by Renée James, Intel senior vice president and general manager of software and services. McAfee representatives deferred comment to Intel.
With McAfee, however, Intel has signaled that it may see security as being as important as, say, graphics, a capability Intel began integrating into its core PC microprocessors with the introduction of the Core i3 and Core i5 chips earlier this year. In March, for example, Intel chief technology officer Justin Rattner appealed to Otellini to emphasize security, PCmag.com exclusively reported, offering Otellini a "mandate to change the world".
"We believe that security is most effective when enabled in hardware," Otellini said Thursday.
Analysts on the conference call seemed confused as to why the two companies decided to combine themselves, rather than spin off a joint venture or even establish a long-term partnership. Intel's stock fell 3.32 percent, or 66 cents, to $18.93.
Otellini, James, and Dave DeWalt, president and chief executive of McAfee, laid out the rationale for the acquisition in a conference call Thursday morning. The two companies have worked together for roughly 18 months, James said, and the two companies "know, trust, and respect each other". But, she added, "the threats and opportunities are simply too large to tackle alone, and that's why we're here today."
Multi-segment impact
It appears that the merger may affect Intel on several levels. First, Intel executives said that they remain committed to the existing suite of McAfee consumer and corporate anti-malware and security products, including McAfee Total Protection, McAfee Antivirus, McAfee Internet Security, McAfee Firewall, McAfee IPS, as well as the company's expanding product line to cover mobile devices such as smartphones. Second, McAfee will most likely be asked to secure the Wind River operating system and other embedded products, according to a blog post by George Kurtz, McAfee's chief technology officer. And finally, there is the long-term integration both on Intel's core processor lineup as well as in the handheld space, which will most likely require a number of years to fully realize, analysts said.
Some analysts wondered whether the combination would allow Intel salespeople to sell security software to PC OEMs alongside its chips, a practice known as bundling. Historically, bundling has involved Intel's Centrino products, which combined a processor, chipset, and wireless chip for one lump sum. However, Intel's Otellini said that the two companies "were not looking at bundling products per se". Bundled prices that harm competition are also one of the constraints of an Intel-FTC consent agreement, and the FTC has said that it can and would levy fines for violations of that agreement.
A spokesman for the FTC said that the agency could not confirm or deny that such an investigation would take place; the FTC has 30 days after the proposed merger is filed with the SEC to ask for additional information. The Department of Justice's antitrust division can also examine the proposed merger, although the FTC's recent investigation would make it the more likely agency.
According to James, the way to think about the integration would be in terms of "enhanced security solutions that can be hardened," which would suggest a software solution with hardware support. Intel will continue to work with other security vendors, Otellini added.
The integration with Intel's existing embedded software may in fact be the easy part, at least technically.
"You may be surprised that Intel has a software group, when you commonly think of them as a hardware company," George Kurtz, chief technology officer for McAfee, wrote in a blog post on Thursday. "In fact, McAfee is a perfect fit with the Intel acquisition of Wind River, a leader in embedded and mobile software.
"McAfee's strategy of protecting the multitude of devices such as ATMs, printers, digital copiers, and cars fits with helping organizations better manage and protect the IP enabled mobile and embedded devices that run Wind River embedded and mobile software," Kurtz added. "This also dovetails nicely with McAfee's acquisition of Solidcore, a leader in dynamic whitelisting technology that already provides protection for millions of embedded devices."
Historically, however, Intel has struggled with software, according to Andrew Jaquith, a security analyst for Forrester. Jaquith noted that Intel bought XML processing developer Sarvega in 2005 and made it "irrelevant," he wrote Thursday. In 1991, Intel bought LANDesk, then sold it. "Perhaps the most troubling part of the McAfee deal is the prospect that they will mismanage their new division into irrelevance," Jaquith said.
According to Intel's James, security solutions will be extended to Intel's embedded Atom in the near future, allowing Intel to pitch the Atom as a secure processor, capable of withstanding future attacks.
Intel formed the Trusted Computing Group with partners AMD, Hewlett-Packard, IBM, and Microsoft in 2003 with the goal of developing a trusted platform module that would sit alongside chipsets; later, Intel brought some of those capabilities inside its vPro enterprise components. Intel already markets security features such as "remote lockdown," which prevents a lost or stolen PC from being used.
One problem, Jaquith noted, is that those security issues may be less relevant in a world of more tightly engineered mobile operating systems, where Intel is targeting its Atom and Wind River solutions.
"PC devices, and by this I mean those running Windows, have long needed third-party security vendors to help secure the platform," Jaquith wrote. "Early versions of Windows, and even current ones, were not designed with security in mind. Even though Windows 7 is much improved compared to Windows XP, 95 or 2000, the core OS is still based on the Win32 foundation, a twenty-year-old legacy that was designed to run on "everything." Contrast that with the highly sandboxed, compartmentalized, digitally signed "apps" model of the BlackBerry OS and Apple's iOS. With these two operating systems, you don't need on-board anti-virus, or HIPS, or anything else — and if you do, it is because Apple or RIM have screwed up. Both of these vendors are taking responsibility for their platforms in totality in ways that Microsoft never did, or could have. Neither iOS or BlackBerry OS depend in any way on hardware capabilities Intel or anybody else could bring to the table, other than the root-of-trust embedded in the handset; all of the security differentiation is is in the OS. And that, frankly, is where it belongs."
Tying McAfee's security to Intel's computing platforms will require time; the integration of graphics and microprocessor has taken years to develop, even at companies like AMD and Intel that have already had both capabilities in house.
"A deep collaboration is a few years out," said Andy Bryant, Intel's former chief financial officer and current chief administrative officer, during the call. Intel's current CFO, Stacy Smith, is on sabbatical.
And enabling that long-term development required more than just a development agreement, one analyst noted. "That type of security integration wouldn't be possible with the two companies having something like a bull session," said Nathan Brookwood, an analyst who has followed Intel's chip business for years.
Brookwood said that the 2006, $5.4 billion combination of AMD and ATI Technologies, which brought PC graphics into the AMD fold, would simply not have worked as a technology partnership; there would have been thorny questions about who owned what intellectual property, as well as the chip itself. But Brookwood also said that he had had conversations with Sun Microsystems engineers following Oracle's $7.4 billion acquisition of Sun last year. The acquisition allowed for close collaboration on future products, those engineers said, which will contain dedicated logic for accelerating the types of database products Oracle makes, according to Brookwood.
"Intel has scads of intellectual property, as we all know: lots of different areas in system design, buses, you name it – stacks of patents," Brookwood said. If Intel works with third parties, they have to be careful not to divulge features of upcoming processors and other confidential information., he said.
"There isn't an intellectual property problem if you're all part of the same team," Brookwood said. "If not, you're at arm's length."
Effect on rivals
So what does this mean for companies like ARM and AMD, Intel's rivals in the chip business?
"As devices becoming increasingly mobile, demand for security is on the rise. Any device that is able to interact with the Internet is subject to security concerns," TBR's Richardson said. "By embedding McAfee into its processors, Intel is building a stronger baseline of security. As a result, we expect Intel to tout the security as a key differentiator against traditional competitors such as Advanced Micro Devices, as well as newer challengers such as ARM, giving Intel a longer runway for growth."
Although security vendors number in the teens, the possibility that a company like AMD might acquire one is probably remote, Brookwood said. After spinning off its money-losing foundry operation last year, the company is still flirting with profitability, and it needs to concentrate on continuing to deliver its products on time.
"I would be shocked, shocked, if AMD said, 'Oh hey, we're buying Norton,'" Brookwood said.
AMD officials did not respond to a request for comment.
Intel Buys McAfee: Upside for Managed Service Providers?
http://www.mspmentor.net/2010/08/19/intel-buys-mcafee-upside-for-managed-services-providers/
Posted August 19th, 2010 by Joe Panettieri
Intel’s decision to acquire McAfee, announced today, has major implications for managed services providers. The good news: Both Intel and McAfee have successful MSP-centric partner programs. And it’s a safe bet those MSP commitments will continue as Intel digests McAfee. Here’s some more perspective, including the Intel-McAfee implications for Microsoft and Symantec.
In case you missed the official news, Intel is acquiring McAfee for about $7.68 billion in cold hard cash. Now, the unofficial analysis: Intel has focused many of its partner efforts on MSPs in recent months (see http://msp.intel.com). Much of the effort involves Intel’s vPro technology, which makes it easier for MSPs to proactively manage PCs from afar. Also, vPro offers key power management capabilities. True believers include Greg Donovan, CEO of Alpheon, a managed services provider that has successfully promoted vPro offerings into such vertical markets as health care.
Meanwhile, McAfee has a growing following in the managed services market thanks to the MX Logic buyout. MX Logic ranks among the best-known email filtering and spam protection systems. Designed as a SaaS solution, MX Logic successfully recruited hundreds of partners into a recurring revenue model. Key MSP-centric partners include Spam Soap, which has been growing rapidly in recent months.
Sure, McAfee’s partner program has suffered from channel conflict over the years. But more recently, Cisco veteran Alex Thurber has delivered channel credibility within the halls of McAfee. Plus, an Intel-led McAfee may remove any lingering doubts about McAfee’s channel commitment.
Doubting Microsoft, Symantec?
On a side note: I believe Intel’s buyout of McAfee suggests Intel has lost confidence in Microsoft and Symantec on the security front. In recent years, Microsoft has tried to more effectively safeguard Windows, and Symantec has struggled to balance a security and storage strategy ever since acquiring Veritas in 2005.
By acquiring McAfee, Intel essentially tells the world it doesn’t want security concerns to impede IT sales. Instead of relying on Microsoft and Symantec to solve the problem, Intel is addressing the challenge on its own.
How to roll out full disk encryption to PCs and laptops
http://computerworld.co.nz/news.nsf/printer/CC18696A7F8A52A4CC2577810059A06B
Bob Brown looks at encryption issues
By Bob Brown, Framingham | Tuesday, 17 August, 2010
Hardly a week goes by when some organization or another doesn't lose some laptops and face a litany of IT security questions. One that always comes up: Were the systems' disks fully encrypted?
Sometimes the answer is "Yes", but plenty of organizations have yet to make the leap to full disk encryption.
I asked Michael Kamens, information security officer at WGBH Educational Foundation, to lay out the basics of what desktop and laptop encryption entails since he's been spearheading an encryption project involving hundreds of computers at his organization.
If an IT shop is starting from scratch, what's technically involved in encrypting PCs and laptops?
It is a huge undertaking, as each computer must be touched twice: first to push the agent out, and second to configure it with the user. During configuration, the service desk must show the end user how to set up a secure passphrase that will allow the computer to boot past the BIOS. Additionally, the encryption process takes anywhere from four to six hours and does impact the speed of the computer, so it should be run after hours. Probably the biggest source of errors is failing to disable the hard drive's sleep setting, which will stop the process from completing.
What are the benefits of desktop and laptop encryption from a compliance standpoint?
It is mandatory under MA Privacy Law 201 CMR 17 and under the Payment Card Industry Data Security Standard (PCI DSS) for any computer containing Personally Identifiable Information (PII) and/or credit card data. The real benefit is that a lost laptop that "might" contain such data will be unreadable to anyone other than the company and/or owner. That safeguard matters because today most companies have difficulty knowing exactly what's stored on any given computer. But the question I raise at my presentations is: Can you afford to be on the front page of your newspaper or on the 6 and 11 o'clock news? The obvious answer is that everyone should do it to protect privileged data from being read if (really when) a laptop is stolen.
Are there separate challenges in encrypting Macs vs Windows PCs?
There are only two companies that offer Mac encryption – PGP and Check Point – and since Apple does not play nice in the sandbox, the vendors cannot deliver a single sign-on solution. On a PC, once you enter your passphrase at boot up, you are automatically logged into the network. With a Mac, however, you must enter your encryption password and are then presented with the network log-in, which requires another log-in. Additionally, during your project's installation phase you must ensure that every OS version is compatible. One stumbling block is that only Intel-based Macs can be encrypted today, which could have an impact if you have PowerPC Macs: they cannot be encrypted, so they must either be replaced or left unencrypted.
Is there any reason to go with third-party tools when vendors offer their own (like Microsoft's BitLocker for Windows 7)?
You must use a third-party vendor, as the PC and OS vendors' offerings (Apple's and Microsoft's) are not geared for truly effective centralized management. Without centralized management you don't have an easy way to manage the deployment, recover lost passphrases, or view all encrypted computers to see their status. We use PGP, and users do forget their PGP passphrases. The centralized management console allows us to provide a 32-bit one-time unlock token that we give to the user. Since security is critical, whenever we request this token (every token is different for every computer – no universal token) we are prompted with a pop-up informing us that all actions are tracked and audited. Just think: if you didn't have the ability to provide an unlock token, you'd have to format and re-image these computers.
What are the human (as opposed to technical) challenges in encrypting desktops and laptops?
You must be tough -- as in, it's my ball and my glove, so if you want to play you need to do as I say. We do not make encryption optional: if you are in a protected class, your computer is encrypted. Our protected class includes IT, HR, Legal, Finance, and Executives, in addition to anyone handling credit cards, intellectual property, or privileged information.
Is it expensive?
Depending on the number of licenses, the cost can range from $150 to $200 per user, plus the cost of vendor professional services to assist with installation, configuration, roll-out, and training the trainers. So is it expensive when compared to the cost of fines for violating privacy laws or PCI, which can run into the millions, not to mention brand damage? I think it's a bargain.
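For a rough sense of the budget arithmetic, the per-user license range above can be turned into a quick estimate. This is a minimal sketch: the $150-$200 range comes from the interview, while the flat professional-services figure is a hypothetical placeholder, not a quoted price.

```python
# Rough cost model for a full disk encryption rollout.
# License range ($150-$200/user) is from the interview;
# the professional-services fee is a hypothetical placeholder.

def rollout_cost(users, license_low=150, license_high=200, services_fee=5000):
    """Return (low, high) total cost estimates in dollars."""
    low = users * license_low + services_fee
    high = users * license_high + services_fee
    return low, high

low, high = rollout_cost(300)
print(f"300 users: ${low:,} to ${high:,}")
```

Even at the high end, a few hundred encrypted machines cost well under six figures, which is the comparison the interviewee is drawing against multi-million-dollar fines.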
Is it time consuming?
To do it right with both Macs and Windows, I would say two support people can do 10 to 25 machines a day, as long as you have the ability to push the clients out and can dedicate resources. In our case, JAMF Software's Casper and Microsoft's System Center Configuration Manager are used to push out the agent. One area that most do not account for is the time needed for user training.
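The throughput figures above translate directly into a schedule estimate. A minimal sketch, assuming the interview's quoted rate of 10 to 25 machines per day for a two-person team (the 400-machine fleet size is an illustrative assumption):

```python
import math

def rollout_days(machines, per_day_low=10, per_day_high=25):
    """Working days needed at the quoted 10-25 machines/day
    throughput for a two-person support team."""
    best_case = math.ceil(machines / per_day_high)
    worst_case = math.ceil(machines / per_day_low)
    return best_case, worst_case

best, worst = rollout_days(400)
print(f"400 machines: {best} to {worst} working days")
```

The wide spread (here, roughly three to eight working weeks) is why the interviewee stresses dedicating resources and budgeting for user-training time separately.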
Any tricks or tips?
Most vendors will provide you with the ability to do a proof of concept. We used the vendor's hosted servers rather than building our own, which really made it easier and faster. You must plan who is getting encryption so you buy a valid number of licenses. I consider the use of vendor professional services critical to the success of your rollout -- otherwise, prepare to spend a lot of time calling support. Remember to ask your vendor if their product works with your mix of computers, and then make them prove it. Finally, set up end user training to reduce the volume of support calls.
© Fairfax New Zealand Limited, 2010
Trading Lower on Heavy Volume are Shares of Wave Systems on 1x Above-Average Volume (WAVX)
Written on Wed, 08/11/2010 - 11:43am, by Chip Brian
Shares of Wave Systems (NASDAQ:WAVX) are trading down 11.6% to $2.59 today on above average volume. Approximately 421,000 shares have traded hands today vs. average 30-day volume of 207,000 shares.
Spikes in volume can validate a breakout or signify a potential turning point. As such, SmarTrend will continue to monitor shares of WAVX to see if this bearish momentum will continue.
SmarTrend is bearish on shares of Wave Systems and our subscribers were alerted to sell on May 06, 2010 at $2.98. The stock has fallen 13.1% since the alert was issued.
6 Things Every Business Laptop Must Have
http://www.pcmag.com/article2/0,2817,2367637,00.asp
3. Security
For businesses, laptop security is a multi-layered affair. There is, naturally, security software: a company will standardize on either a multi-license or enterprise-level package, one it can update over the network or, for smaller firms, with a USB key. But businesses also need more than software, such as fingerprint or smartcard readers married to a Trusted Platform Module (TPM), which is, essentially, motherboard-based security. The choice of OS will also dictate some of the security options. If a business goes with a standard OS edition, it may lack built-in encryption; Windows 7 Enterprise and Ultimate can encrypt an entire hard drive with a feature called BitLocker (this only works if the laptop has a TPM). Why is that important? Far too many business laptops are lost in the backs of cabs, on planes, and in airport security lines. If the finder (or thief) pulls the hard drive from the system, it will be impossible to read, and even inside the system, the encrypted drive's data should be impenetrable. No business wants the finder perusing company secrets. An encrypted drive is, at least, one level of protection.
Yep, still taxable.....
CT Yankee, re: your RMD
So, you're over the age of 70 1/2? Instead of cash, you can satisfy your RMD with a distribution of Wave stock from your IRA. Your broker will require you to sign a distribution form which will ask if you're distributing cash or stock. Determine how many shares at the current price you'll need to move to satisfy the RMD and move them into a non-retirement account.
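The share count described above is simple division, rounded up so the distribution fully covers the RMD. A minimal sketch, assuming a hypothetical $5,000 RMD and the $2.59 WAVX price quoted elsewhere on this board (your actual RMD amount comes from your broker or custodian):

```python
import math

def shares_for_rmd(rmd_amount, share_price):
    """Whole shares needed to satisfy the RMD at the current price,
    rounded up so the in-kind distribution fully covers the amount."""
    return math.ceil(rmd_amount / share_price)

# Hypothetical example: $5,000 RMD at $2.59/share.
print(shares_for_rmd(5000, 2.59))
```

Note the distributed shares are valued at the price on the date of distribution, so the exact count should be confirmed with the broker when the transfer is made.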
Hope this helps.
FM
Webcast numbers:
TELEPHONE: via (212) 231-2905 or (415) 226-5355.
WEBCAST/REPLAY: available at http://www.wave.com/news/webcasts and
archived for 30 days.