Coming soon: Full-disk Encryption For All !!!
http://www.zdnetasia.com/techguide/storage/0,39045058,62052121,00.htm
Recently, the Trusted Computing Group (TCG), a not-for-profit organization that promotes open standards for hardware-enabled security technologies, released final specifications establishing the standards by which all hard drives will have the built-in capability to enforce encryption at the hardware level.
Of course, not all data breaches are the result of lost or stolen hardware, but by including an encryption option right in the actual storage device, organizations can completely close one possible avenue of entry when it comes to loss of sensitive information.
Now, if one of your executives is on a business trip and loses his laptop while traveling, worries about possible information loss can go away.
The specifications developed by the team of hard drive manufacturers operate at a level that does not impact overall system performance. Today's most common encryption methods operate between the operating system and the hardware, imposing performance penalties that can sometimes be noticeable.
There are a total of four standards covering various storage elements. From the specification documents themselves:
TCG Storage Work Group Security Subsystem Class: Opal. The Opal SSC is an implementation profile for Storage Devices built to: 1) Protect the confidentiality of stored user data against unauthorized access once it leaves the owner's control (involving a power cycle and subsequent deauthentication); 2) Enable interoperability between multiple SD vendors. Think individual computers.
TCG Storage Work Group Security Subsystem Class: Enterprise. This specification is an implementation profile for trusted storage devices commonly deployed within Enterprise-class systems. It provides storage device implementation requirements needed to guarantee interoperability between storage devices from different vendors. Enterprise-class systems often deploy a mix of cross-vendor storage devices and interoperability is therefore key, both for non-trusted and trusted storage devices. This specification defines a limited set of TCG Trusted Storage functionality that, combined with Full Disk Encryption (FDE), protects the confidentiality of user data at rest. Only a single threat scenario is addressed: removal of the storage device from its host system involving a power cycle of the storage device and subsequent unauthorized access to data stored on that device. This covers the enterprise space.
TCG Storage Interface Interactions Specification. This document defines for each interface: 1) Mapping of interface events to TCG resets; 2) Mapping of IF-SEND, IF-RECV; 3) Handling of common TPer errors; 4) Discovery of security capabilities; 5) Miscellaneous issues. In short, this is the communications portion of the standard - think IDE, SCSI, etc.
Trusted Computing Group Optical Storage Subgroup FAQ. Defines a set of encryption standards that can be applied to optical storage. Note that only optical storage is included in this particular document. Other removable storage types, such as flash and solid state drives and tape devices, are not covered.
The hard drive standards have been developed jointly by Fujitsu, Hitachi, Samsung, Seagate, Toshiba, and Western Digital so that there is deep interoperability between different vendors.
I believe it's a matter of time before governments pass laws related to full-disk encryption, so these kinds of cooperative standards are welcome, as they will hopefully result in minimal consumer impact while providing maximum protection.
Catching the E-Wave!!
http://www.wavesys.com/news/media/MTmarch2009.pdf
keV, link please to support your post....
Sorry keV, it is OT:
which is an acronym for "old technology". Wave isn't impacted by software solutions for legacy machines.
Again,
"It seems to me that the dominoes are falling at an accelerating pace and that within two to three years, every device that ships with a hard drive or solid-state disk will offer self-encrypting drives. Chief information security officers, purchasing managers, management software vendors, and government agencies should plan for this inevitability."
The Hartford Introduces Data Privacy Coverage For Technology Companies
DJ Press Release Wire
10:22 AM Eastern Daylight Time Mar 10, 2009
HARTFORD, Conn.--(BUSINESS WIRE)--March 10, 2009--
While large-scale data security breaches are those that make news, a breach of any size can be costly for software developers, hardware firms and other technology companies that have non-public personal information in their control. Data breach laws in many states require notification and credit monitoring services for those affected, the costs for which are incurred by the company responsible for the breach. With the average per-record cost of a data breach at $202, according to a 2008 Ponemon Institute study, the cost of a breach involving just 500 records could exceed $100,000.
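The per-record arithmetic behind that figure checks out; a quick sketch (the $202 figure is from the cited Ponemon study, while the function name is my own):

```python
# Estimated breach cost at the Ponemon Institute's 2008 average of $202 per record.
PER_RECORD_COST = 202  # USD per compromised record (2008 Ponemon average)

def breach_cost(records: int) -> int:
    """Rough total cost of a breach involving `records` compromised records."""
    return records * PER_RECORD_COST

print(breach_cost(500))  # 101000 -- just over the $100,000 cited for 500 records
```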
To address this exposure for technology firms, The Hartford Financial Services Group, Inc., (NYSE: HIG) one of the nation's largest financial services companies, has added First Party Data Privacy Expense coverage, along with Cyber Extortion Expense coverage, to its FailSafe(R) suite of technology liability coverages. These new coverages are offered as an endorsement to the FailSafe GIGA(R) and FailSafe TERA(R) policies.
"Many technology companies are at risk for improper dissemination of non-public personal information or violation of data privacy laws. This endorsement is designed to address direct costs that would not be covered by third party technology professional liability coverage," said David J. Selembo, assistant vice president of professional liability, underwriting & operations for The Hartford's Technology Practice Group, which provides insurance coverage tailored to the risks of technology firms.
The Hartford's Data Privacy Expense coverage pays for actual expenses incurred as a result of a policyholder's negligent acts, errors or omissions that result in the improper dissemination of non-public personal information, or a breach or violation of data privacy laws. Specific components of the coverage may include:
1. Notification expenses incurred to comply with notification laws.
2. Crisis management expenses incurred for fees and costs associated with hiring a crisis management firm to perform services that minimize potential harm and maintain or restore confidence in the policyholder.
3. Data privacy regulatory and credit monitoring expenses incurred in connection with a statutory mandate requiring credit monitoring for third parties in compliance with data privacy laws, legal expenses in defense of a data privacy regulation proceeding, and certain fines or penalties, where insurable, in connection with a data privacy regulation proceeding.
4. Cyber investigation expenses incurred to have a third party investigate the policyholder's computer system to determine the source of a data privacy breach.
The Hartford's Cyber Extortion Expense coverage addresses expenses incurred by a policyholder in the event of an extortion threat to cause an actual interruption, suspension, or failure of the company's computer system, including the failure to prevent unauthorized access or unauthorized use of the computer system.
To learn more about The Hartford's FailSafe suite of customized solutions for the technology industry, agents and brokers should contact david.selembo@thehartford.com, their local Hartford sales representative or a Technology Practice Group underwriter.
About The Hartford
The Hartford is one of the nation's largest financial services companies and a leading provider of investment products, life insurance and group benefits; automobile and homeowners products; and business property and casualty insurance. International operations are located in Japan, the United Kingdom, Canada, Brazil and Ireland. The Hartford's Internet address is www.thehartford.com.
jazz:
"It seems to me that the dominoes are falling at an accelerating pace and that within two to three years, every device that ships with a hard drive or solid-state disk will offer self-encrypting drives. Chief information security officers, purchasing managers, management software vendors, and government agencies should plan for this inevitability."
http://news.cnet.com/8301-1009_3-10188267-83.html
Juniper's Network Security Gets Nosier
By Sean Michael Kerner
March 9, 2009
http://www.internetnews.com/infra/article.php/3809366/Junipers+Network+Security+Gets+Nosier.htm
For a time, Network Access Control (NAC) was all the rage in the enterprise networking world, offering the promise of security by validating users when they attempted to access the network.
But admission-based security control doesn't keep an eye on users after they gain access to the network.
That's where the Interface for Metadata Access Point (IF-MAP) standard comes into play, enabling multiple components of network infrastructure to communicate and to correlate security data to user activity.
IF-MAP is also a critical new component in Juniper Networks' security portfolio, which today received an overhaul: a new NAC release, a new services router, and a network management software update to make sense of it all.
The new releases come as the market for network security continues to grow, hitting $5.5 billion in 2008, according to Infonetics Research, which also found that Juniper remains No. 2 behind market leader Cisco.
With its new Adaptive Threat Management solutions, Juniper is aiming to grow share and provide an integrated approach to managing network security.
"To be frank, most devices on the network security side often operate in silos -- be it an IPS, firewall, or VPN, they operate separately," Sanjay Beri, Juniper's general manager of access solutions, told InternetNews.com. "Making sure that's not the case, so you can get the benefit of sharing information in a multivendor, open standard way, that's a big piece of this announcement."
The new security integration is coming by way of Juniper's UAC (unified access control) 3.0 technology, which is Juniper's flavor of NAC. The UAC 3.0 release is the first major upgrade to Juniper UAC (http://www.internetnews.com/infra/article.php/3643436) since its 2.0 release in 2006.
Integrating IF-MAP is one of the new release's critical components, tying it into Juniper's other offerings for tighter security.
The upshot of IF-MAP is that it goes beyond pre-admission access control, correlating post-connection events tracked by different network security devices to enforce policies on network use even after a user has connected.
A consortium of vendors called the Trusted Computing Group (TCG) is developing IF-MAP as an open standard for hardware-based security. Members include HP, IBM, Intel and Microsoft. Juniper is also a consortium member, working on TCG's Trusted Network Connect (TNC) initiative, which is developing an open standard for NAC admission control.
The IF-MAP standard was first announced in April 2008 (http://www.internetnews.com/infra/article.php/3743346) and has yet to be finalized. Still, Beri said the standard lays important groundwork for cross-vendor compatibility in the realm of post-connection activity tracking.
"There is still work to be done on IF-MAP at TCG," Beri said. "However, the spec is published on the TCG Web site and there are vendors working on interoperability ... Other vendors have integrated with us via IF-MAP. The key for us on the IF-MAP side is people don't have to come to us to integrate."
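At its core, IF-MAP is a publish/search/subscribe database for network metadata: one device publishes observations about an identifier (an IP, a user), and other devices query or subscribe to correlate post-admission activity. The real protocol is XML exchanged over SOAP; the in-memory sketch below (all class and method names are my own, hypothetical shorthand) only illustrates the model:

```python
from collections import defaultdict

class ToyMapServer:
    """Minimal in-memory sketch of an IF-MAP-style metadata server.

    Real IF-MAP exchanges XML documents over SOAP; here metadata is just
    a dict keyed by an identifier such as an IP address or username.
    """

    def __init__(self):
        self._metadata = defaultdict(list)
        self._subscribers = defaultdict(list)

    def publish(self, identifier, metadata):
        """A sensor (firewall, IPS, NAC enforcer) publishes an observation."""
        self._metadata[identifier].append(metadata)
        for callback in self._subscribers[identifier]:
            callback(identifier, metadata)

    def search(self, identifier):
        """Another device queries everything known about an identifier."""
        return list(self._metadata[identifier])

    def subscribe(self, identifier, callback):
        """Get notified whenever new metadata arrives for an identifier."""
        self._subscribers[identifier].append(callback)

# A NAC enforcer records an admission; an IPS later flags the same host.
mapd = ToyMapServer()
alerts = []
mapd.subscribe("10.0.0.7", lambda ident, md: alerts.append(md))
mapd.publish("10.0.0.7", {"event": "authenticated", "user": "alice"})
mapd.publish("10.0.0.7", {"event": "ips-alert", "sig": "port-scan"})

# The policy engine can now correlate the IPS alert with the logged-in user.
print(mapd.search("10.0.0.7"))
```

The point of the standard is that the publisher and the subscriber can come from different vendors, which is exactly the cross-vendor integration Beri describes.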
New Juniper security interface and SRX hardware
New changes in Juniper's security portfolio also make managing security policy across multiple devices easier.
Beri explained that Juniper is now offering a unified management interface, so enterprises can administer both local LAN access as well as remote SSL-VPN access though a single UAC interface.
As a result, a network administrator will now have the ability to define a security policy that is common across both remote access and the LAN side.
The unified policy framework will also introduce identity federation among both local and remote users. One result is that a network's LAN and remote areas will share users' identity and state information, so remote users won't have to log in twice, Beri said.
Juniper is also rolling out new SRX router upgrades to its hardware portfolio. The company first debuted SRX in September 2008 as its new services router product family.
The line's two initial models, the 5600 and 5800 gateways, offered improved scalability and performance over Juniper's older ISG product line, along with added firewall and security capabilities.
With today's launch of its new SRX 3400 and 3600, Juniper is taking the capabilities of the SRX 5800 and shrinking them down into a smaller form factor.
The company is banking that this is a key selling point for companies looking for high performance but a smaller footprint. Brian Lazear, director of product management for Juniper's high-end security business unit, claimed that the SRX 5800 is the world's fastest firewall, with a dynamic services architecture that lets customers scale services as needed.
While the SRX platform competes against Cisco's ASR product lineup, Juniper's SRX is also competing against the company's own, older ISG routers, whose users Juniper hopes will move to the SRX for new performance benefits.
Lazear noted that new silicon enhancements in the SRX line provide up to 175,000 connections per second. The older Juniper ISG in the same size class could only support up to 30,000 connections per second, according to Lazear.
Though Juniper's adaptive threat technology raises the bar in terms of security, there is still work to be done in advancing the platform.
"I'm a realist; there are always problems to solve," Beri said. "I think the net result of what Juniper has built is a framework to innovate and do it well. The SRX lets people build more services as needed. Our policy framework with UAC and SSL all applies whether we came out with a new box or not."
"What we're really about is setting the right architecture and framework but absolutely we expect there to be more innovation," Beri added. "If there wasn't I guess we'd all be out of business."
Government should lead transition to self-encrypting drives!!!!
http://news.cnet.com/8301-1009_3-10191569-83.html
by Jon Oltsik
I've recently written about a new standard published by the Trusted Computing Group (TCG) for self-encrypting drives. With this standard, Fujitsu, Hitachi, Seagate, Toshiba, and Western Digital are shipping or will soon ship self-encrypting hard drives for laptop computers. This in turn should prompt a transition, where users will opt for systems with self-encrypting drives rather than install encryption software utilities.
To me, this conversion is inevitable since hardware-based cryptographic processing tends to lead to superior security and performance while eliminating the muss and fuss around software procurement, installation, and maintenance.
Given these benefits, I believe that the U.S. federal government should make self-encrypting drives a new standard for all federal system purchases. This would not only enhance the security of private data on federal systems but also help jump-start this tech industry transition. This is a perfect opportunity for the federal government to take the lead because:
Demand for encryption remains high. In 2006, the Office of Management and Budget instructed civilian agencies to put a plan together for laptop security within 45 days. Subsequent to this plan, agencies were supposed to encrypt all laptops. According to several estimates, somewhere between 50 percent and 60 percent of these laptops remain unprotected. If all new systems contain self-encrypting drives, federal agencies can focus their attention on a stop-gap plan for aging systems in the field.
The federal government has programs and people in place. The Department of Defense and General Services Administration have already established a "Data at rest Tiger Team" to address this problem in the defense community. It is safe to assume that this team knows what's out there, which systems are still vulnerable, and which ones are up for replacement. Adding systems with self-encrypting drives could provide this team with a new tool to accelerate this effort.
Self-encrypting drives could help secure the new Federal Desktop Core Configuration (FDCC). To improve security, federal officials are in the process of defining a set of FDCC guidelines for laptops and desktops. With self-encrypting drives, these systems will be secure upon delivery.
The Defense Department is slim on procurement people. Just last week, a team of experts told a Senate committee that the Defense Department is constrained by a lack of procurement people. OK, so here's a thought. Wouldn't it be more efficient to purchase systems with self-encrypting drives once rather than purchase systems and then purchase software? Oh, and self-encrypting drives would also eliminate the systems integration burden as well.
I could go on and on, but I think I've made my point. The federal government could improve security, lead the industry, and lower costs by embracing self-encrypting drives for all new systems. This should be plenty of motivation for federal agencies such as the General Services Administration, the Department of Defense, and others in the Beltway to get busy.
Jon Oltsik is a senior analyst at the Enterprise Strategy Group. He is not an employee of CNET.
Self-Encrypting Drives Based on New Trusted Computing Group Specifications Now Available
http://www.earthtimes.org/articles/show/self-encrypting-drives-based-on-new-trusted-computing-group-specifications-now-available,742387.shtml
Hard drive vendors have started shipping self-encrypting drives based on the Trusted Computing Group’s specifications, the group noted today. Final specifications for client drives, data center drives and interoperability of self-encrypting drives were published in late January of this year and are widely supported by PC, server, drive and applications providers.
Fujitsu has demonstrated drives based on TCG’s Opal self-encrypting drive specification, which is focused on drives for PCs, while Hitachi GST offers these drives now. Seagate is now working with early adopters IBM and LSI Corporation on data center storage devices supporting the TCG Enterprise self-encrypting drive specification.
Wave Systems Corporation currently provides solutions to set up and manage all available self-encrypting drives. WinMagic provides support and management applications for self-encrypting drives in an enterprise environment for both Windows and Mac platforms. CryptoMill Technologies also has noted its support for the TCG specifications. McAfee will look to support the TCG Opal specification to provide a choice of encryption models and implementation options to its customers.
“TCG's new storage security specifications and resulting drives from the vendors that support them have been needed for some time and address a number of high-performance, interoperability, and security concerns. This change represents a significant improvement for the storage industry and will benefit vendors as well as users who must protect their data,” noted Rob Enderle, president and founder of Enderle Group.
The new specifications give vendors a blueprint for developing self-encrypting storage devices (e.g., hard drives) that lock down data automatically in less than a second and can be completely erased in milliseconds. Self-encrypting drives can be easily deployed in the enterprise, because drives based on TCG specifications are easily managed, reduce the cost of deployment and management, and are interoperable across PC platform types.
The TCG approach specifies encryption in the drive itself, rather than in other components of the PC. Putting cryptographic operations in the drive has a number of benefits. These benefits include the ability to encrypt the entire drive contents immediately upon device manufacture, strong protection of the encryption keys combined with strict access control, and no loss of system performance. The contents of the self-encrypting drives are always encrypted and the encryption keys are themselves encrypted and protected in hardware that cannot be observed by other parts of the system. AES and other cryptographic algorithms are supported in the specifications, and vendors can add additional security features to their devices. Because encryption is handled in the drive, overall system performance is not affected and is not subject to attacks targeting other components of the system.
Compared to encryption outside of the drive, self-encrypting drives do not interfere with system maintenance, compression, de-duplication, or end-to-end integrity metrics. In addition, the encryption key never leaves the drive, greatly simplifying key management. For the enterprise, these security features mean reliable compliance, ease of deployment, and ease of management. Additionally, repurposing drives at redeployment or end-of-life costs significantly less than other options.
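The key hierarchy described above — a media key that encrypts the sectors, itself stored only in wrapped form under a credential-derived key, never leaving the drive — can be sketched with a deliberately toy stream cipher (a real drive uses AES in hardware; every name below is my own illustration, not drive firmware). It also shows why erasure is near-instant: discarding the wrapped media key renders every sector unreadable.

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Stands in for the AES engine a real self-encrypting drive has in hardware."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

class ToySelfEncryptingDrive:
    """Illustrative only: shows the key hierarchy, not real drive behavior."""

    def __init__(self, password: str):
        # The media key is generated at "manufacture" and never leaves the drive.
        media_key = os.urandom(32)
        # It is stored only in wrapped (encrypted) form, under a key derived
        # from the user's credential.
        kek = hashlib.pbkdf2_hmac("sha256", password.encode(), b"demo-salt", 100_000)
        self._wrapped_media_key = keystream_xor(kek, media_key)
        self._sectors = {}

    def _unwrap(self, password: str) -> bytes:
        kek = hashlib.pbkdf2_hmac("sha256", password.encode(), b"demo-salt", 100_000)
        return keystream_xor(kek, self._wrapped_media_key)

    def write(self, lba: int, data: bytes, password: str):
        sector_key = self._unwrap(password) + lba.to_bytes(8, "big")
        self._sectors[lba] = keystream_xor(sector_key, data)

    def read(self, lba: int, password: str) -> bytes:
        sector_key = self._unwrap(password) + lba.to_bytes(8, "big")
        return keystream_xor(sector_key, self._sectors[lba])

    def crypto_erase(self):
        # Instant erase: discard the wrapped media key; every sector is now noise.
        self._wrapped_media_key = os.urandom(32)

drive = ToySelfEncryptingDrive("hunter2")
drive.write(0, b"payroll.xls", "hunter2")
print(drive.read(0, "hunter2"))                     # b'payroll.xls'
drive.crypto_erase()
print(drive.read(0, "hunter2") == b"payroll.xls")   # False: data is unrecoverable
```

Because only the small wrapped key is replaced, "erasing" terabytes takes milliseconds, which is the repurposing cost advantage noted above.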
In the data center, encryption typically has been costly and time-consuming. This is mainly due to the demands on bandwidth. Self-encrypting drives do encryption inside of each drive, where it is cheaper, safer, and more scalable to implement than encryption in the RAID controller.
Trusted Computing Group, an industry organization that enables computing security, has created a portfolio of specifications to enable more secure computing across the enterprise in PCs, servers, networking gear, applications and other software, hard drives and embedded devices. More information and the organization’s specifications and work groups are available at the Trusted Computing Group’s website, www.trustedcomputinggroup.org.
Brands and trademarks are the property of their respective owners.
Join TCG for “Turning on the Trust,” A Full-Day Educational Workshop for the Developer and End-User Communities!
The Trusted Computing Group invites you and your customers to attend the TCG educational workshop, "Turning on the Trust: A Developer and End-User Lab to Enable Higher Security for Clients, Networks and Storage Devices”, April 20, 2009 from 9:00am – 12:00pm and 1:00pm – 4:00pm. The workshop is part of the RSA 2009 pre-conference program in the Orange Rooms 130 – 133, Moscone Convention Center, San Francisco, California.
Experience first hand:
· How to set up and manage Trusted Platform Modules, available in most enterprise PCs
· Client server management of trusted systems
· How to set up authentication
· How to set up secure email and password management
· How to set up a secure wireless network w/trusted devices
· How to use network access control to ID who is on the network, protect against malware and ensure systems are in policy
· How to protect data with widely available hardware and software tools
· How to manage keys
9:00am – 12:00pm - Trusted Computing Developer Lab
This hands-on lab will provide developers with an overview of the Secure Storage, Trusted Platform Module Solutions, Trusted Network Connect, and explain in-depth how to develop applications/products that support the specifications. Presenters will address interoperability, development on multiple platforms and environments.
1:00pm – 4:00pm - Trusted Computing in the Real World, A Lab for IT
This lab session will tackle common challenges and obstacles to enabling the Trusted Platform Module (TPM) and applications; deploying encrypted storage in the enterprise, including key management and securing networks through standards-based network access control.
Attendees will walk away with a deployment plan, checklists, and ideas for using these hardware-based technologies with their existing enterprise infrastructure, as well as how to enable the installed base of TCG security equipment that has already been acquired.
Booth # 2133 Activity
The TCG will continue the hands-on experience by showcasing a number of interactive demos in booth #2133 on the RSA Conference 2009 show floor during this event. TCG Member Companies participating in this event are Great Bay Software, Infineon Technologies AG, Juniper Networks, Lumeta Corporation, LSI Corporation, Seagate Technology and Wave Systems. Also, visit one of over 40 TCG members that will be exhibiting during the show to receive valuable information.
Want To Attend?
Click here to register for the RSA Conference
Cyber Insecurity is Destroying Innovation, Will Hamstring US Economic Future
DJ Press Release Wire
3:55 PM (GMT-05:00) Eastern Time (US & Canada) Mar 05, 2009
Fund Cybersecurity in Stimulus Package for Digital Infrastructure
WASHINGTON--(BUSINESS WIRE)--March 05, 2009--
Today Rob Housman the Executive Director of the Cyber Secure Institute released this statement:
America's competitive advantage has always been, and needs to remain, our ability to innovate. American innovation has given rise to everything from mass production to skyscrapers to airplanes. America has brought the world four successive generations of the information age, first with the telephone, then the television, then the computer, and then the Internet.
To be successful America needs to constantly push the limits of innovation and efficiencies. We need to be out in front of the learning curve. We need to be highly entrepreneurial and technologically driven.
But American innovation can't take consumers and companies to the next level if they don't want to go there because they fear the security of their data, money, and personal privacy.
People and companies are already beginning to hold back on adopting new advances because of IT security fears. For example, a recent study by a major market research firm determined that, "Security concerns are the single biggest factor inhibiting consumer acceptance of mobile banking."
Given the inherent insecurity of digital infrastructure people have good reason to be concerned. Consider the recent track record:
-- Heartland Payment Systems recently announced what may be the single largest data breach ever, implicating tens of millions of credit card transactions. The breach was caused by malicious software. The company reports it has no idea who inserted the software or how it got there. Nor does it know exactly whose data was compromised.

-- A November 2008 study of mobile devices used by over 1,000 healthcare professionals found that 93% of the devices were at risk. The study found that 49% of the healthcare professionals downloaded sensitive patient data onto their devices. Over 71% protected their devices and sensitive data with just a single password. At least 13% had lost one or more devices containing sensitive information. No wonder that numerous studies find upwards of 70-80% of Americans are concerned about the security of their electronic medical records.
If people don't trust the security of IT systems we will never be able to fully capitalize on their promise. Smart devices have little value if smart people won't use them. Markets won't move beyond online videos and books if ecommerce increasingly becomes "eswindled."
The American private sector has been slow to realize that insecurity is compromising its ability to compete and even slower to take the steps needed to address the problem. This needs to change.
There are a variety of tools the new Administration should consider to drive this change.
The Obama Administration plans digital infrastructure funding as part of the stimulus package. These monies should be used to make the digital superhighway safer and more secure, not just extend its reach. For example, we welcome funding to deploy new technologies to improve healthcare, but if these technologies are inherently insecure we can do as much harm as good.
At the same time the new Administration should also consider a variety of other mechanisms to cajole if not compel companies to deploy inherently secure digital infrastructure. Such mechanisms could range from cyber security disclosure requirements under the Securities and Exchange Commission to actual baseline standards for critical systems.
The text of the full statement can be found at: http://www.cybersecureinstitute.org/blog/
The Cyber Secure Institute is a newly established analysis and advocacy institute dedicated to serving as the voice for effective cyber security. For more information: www.cybersecureinstitute.org.
CONTACT: The Cyber Secure Institute
Rob Housman
202-486-5874;
202-289-7999
rhousman@cybersecureinstitute.org
SOURCE: The Cyber Secure Institute
Copyright Business Wire 2009
Self-encrypting drive standard gains momentum
http://news.cnet.com/8301-1009_3-10188267-83.html
I've long been a big proponent of self-encrypting drives as the best way to encrypt data-at-rest on PCs and storage systems.
This belief became a lot more real in January when the Trusted Computing Group (TCG) published three storage encryption standards for laptops, enterprise storage, and software interoperability. Fujitsu, Hitachi, Seagate, and Toshiba support these standards and are already shipping self-encrypting drives.
In February, IBM joined the fray, further validating the self-encrypting drive standard. IBM announced that its massive DS8000 storage system will now offer self-encrypting drives to protect the confidentiality and integrity of data-at-rest. LSI, another leading storage system vendor, is also on board.
I have to believe that Fujitsu and Hitachi will soon follow this trend. Both companies currently offer encrypting storage systems that use a cryptographic processor resident in their storage controllers. Since both companies supply self-encrypting drives, it is likely that they will replace encrypting controllers with self-encrypting drives in future product revisions.
It seems to me that the dominoes are falling at an accelerating pace and that within two to three years, every device that ships with a hard drive or solid-state disk will offer self-encrypting drives. Chief information security officers, purchasing managers, management software vendors, and government agencies should plan for this inevitability.
Jon Oltsik is a senior analyst at the Enterprise Strategy Group. He is not an employee of CNET.
Introduction to SSDs
http://www.legitreviews.com/article/907/1/
PC performance has been steadily increasing in terms of CPUs, RAM, video cards and motherboard chipsets. The system bottleneck that has developed as a result is the ability to read and write data at a speed in proportion to the rest of the components. We have seen hard drives go from 5,400 to 15,000 RPM in an effort to alleviate this bottleneck, but it has come at the expense of increased noise, energy utilization and heat while limiting capacities. Enter the age of the Solid State Drive (SSD).
For those who haven't kept up on the latest technology, let me start out with a bit of an overview. In short, SSDs are devices that use solid-state flash memory chips to store persistent data in large capacities. If you've used USB flash drives, most low- to mid-capacity iPods or other portable multimedia devices, odds are that you have used this technology already. SSDs have actually been around since the late 1970s, but limited capacities and high manufacturing costs kept them mostly out of the consumer product market for many years.
In the last few years, SSDs have started reaching non-enterprise consumers as prices have dropped while performance and capacities have increased. This is partly due to the proliferation of flash-based devices making large-scale manufacturing more affordable, as well as ongoing technology advances. Those familiar with product "binning" know that, due to variability in the manufacturing process, identically produced parts turn out with varying levels of performance. Better-performing components tend to be binned and sold as higher-performing parts at a price premium, and generally allow greater flexibility for enthusiast tweaking (overclocking). SSDs are generally built with high-binned chips, while the lower-performing chips make their way into USB flash drives and other devices.
There are two types of SSDs: those that use Multi-Level Cell (MLC) flash memory and those that use Single-Level Cell (SLC) flash memory. The former is slower but more affordable, making it the consumer-oriented option, while the latter is faster and more expensive and is generally used at the enterprise level. Both are more durable than conventional hard drives, as there are no moving parts to fail. Now that we've gotten that out of the way, let's move on to more exciting things.
Most SSDs released in 2008 had great read performance, but their write performance, particularly random writes, sometimes lagged behind traditional spinning-platter hard drives depending on the circumstances. Most used a JMicron controller and had little if any onboard cache, which resulted in what most users term "stuttering" during use. Generally, this can be mitigated or eliminated with some tweaks, depending on the user's ability and willingness to apply them. Much of this can be traced to Windows: anything prior to Windows 7 was optimized for traditional hard drives and is not particularly well suited to SSDs as configured by default. This is especially true of the way XP aligns partitions by default. That is no knock on Microsoft, as SSDs simply were not widely used when XP and Vista were coded and tested. Additionally, many drives lacked caches large enough to adequately support the desired performance.
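The XP alignment issue mentioned above comes down to simple arithmetic: XP starts the first partition at sector 63, an offset that is not a multiple of typical flash page or erase-block sizes, so filesystem clusters straddle flash boundaries and every write touches two blocks. A minimal sketch of the check, assuming 512-byte sectors and a 128 KiB erase block (a common but by no means universal figure):

```python
# Illustrative sketch: is a partition's starting offset aligned to an SSD's
# erase-block size? Sector 63 is Windows XP's default start; the 128 KiB
# erase block is an assumed typical value, not a universal one.

SECTOR_BYTES = 512

def is_aligned(start_sector: int, erase_block_bytes: int = 128 * 1024) -> bool:
    """True if the partition start falls on an erase-block boundary."""
    return (start_sector * SECTOR_BYTES) % erase_block_bytes == 0

# Windows XP default: partition starts at sector 63 -> misaligned.
print(is_aligned(63))    # False
# Vista/Windows 7 default: sector 2048 (a 1 MiB offset) -> aligned.
print(is_aligned(2048))  # True
```

The 1 MiB offset used by later Windows versions works because it is a multiple of every common flash block size, which is why realigning an XP partition is one of the standard SSD tweaks.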
Five Steps for Keeping Data Safe and Secure
http://www.smallbusinesscomputing.com/article.php/3807596
By Jennifer Schiff
February 27, 2009
According to the Identity Theft Resource Center, there were 656 known data breaches that exposed nearly 35.7 million records last year. These breaches occurred at businesses, financial institutions, medical facilities, educational institutions and government agencies.
The main cause of the breaches? According to ITRC, only 2.4 percent of the organizations that experienced a breach had encryption or other strong protection methods in use, and only 8.5 percent of the breached information was password protected.
So why aren't more organizations password protecting and encrypting data? Some are complacent, while others falsely believe their data is already properly protected. Still others fear having to spend large sums of money and time on new software or hardware to properly encrypt data.
Yet increasingly the monetary and public relations cost of having — and having to report (now required by law in most states) — a breach is so high that it behooves organizations to implement rigorous data protection policies and standards. And these policies and standards needn't be complex or expensive.
So while data storage vendors like Sun, EMC, HP and IBM debate encryption key management standards, here are some steps you can take to protect your data now.
Start With a Good Data Protection Policy
Indeed, security expert Adam Levin, chairman and co-founder of Identity Theft 911, argued that a good data protection policy involves just five things:
Instituting good security and privacy policies for collecting, using and storing sensitive information.
Using strong encryption when storing information on computers and laptops.
Limiting who has access to sensitive information.
Safely purging old or outdated sensitive information.
Having an incident response plan in case a breach occurs.
In addition to the above, Levin also suggested that organizations have firewalls, anti-spyware and antivirus protection in place and kept up to date; refrain from using wireless networking technologies (Wi-Fi); and truncate data so that sensitive information is not used where it is not needed.
But the most important thing, he reiterated, was to "make sure you have secure, encrypted ways of obtaining and storing sensitive information — and employ encryption protocols and encrypt all sitting data."
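Levin's advice to truncate data, so sensitive information never appears where it isn't needed, is one of the cheapest protections to implement. A minimal sketch of the idea, masking a card number for display; the function name and sample number are illustrative, not from any product in the article:

```python
# Sketch of the "truncate data" advice: keep only what a given context needs.
# Names and the sample card number here are illustrative only.

def mask_card_number(pan: str, visible: int = 4) -> str:
    """Replace all but the last `visible` digits of a card number."""
    digits = pan.replace(" ", "").replace("-", "")
    return "*" * (len(digits) - visible) + digits[-visible:]

print(mask_card_number("4111 1111 1111 1234"))  # ************1234
```

A receipt, log line or support screen built this way never holds the full number, so a breach of that system exposes nothing usable.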
Encrypt, Encrypt, Encrypt
The Trusted Computing Group (TCG), an industry organization that develops specifications for computing security across the enterprise, also believes that good encryption is essential for properly protecting data. Data protection vendors have taken note and are busy developing new and improved software- and hardware-based encryption solutions, on both the client (such as laptops and USB drives) and enterprise level.
In December, for example, BitArmor announced the release of version 3.2 of its BitArmor DataControl software, an information-centric security solution that uses full disk encryption and persistent file encryption to protect data directly, rather than the devices or network used to access it. Last month, so confident was the company in its approach that it pledged to refund the entire purchase price of its software if BitArmor-protected data was ever breached (though not necessarily the costs involved with any lost or breached data).
"Full disk encryption is important front-line protection," stated Patrick McGregor, the co-founder and CEO of BitArmor, especially with more and more employees using laptops and USB drives, which are easy to steal or lose. And while you can use self-encrypting USB drives to protect that data, those drives can be expensive, said McGregor. That's one of the reasons why BitArmor took a software-based approach.
Instead of deploying separate data protection solutions for PCs, laptops, USB drives, e-mail attachments, application servers, storage servers and various networks, BitArmor says its software lets organizations protect and manage all data with one product, eliminating the need for multiple point solutions for data security and data management. With this approach, data doesn't have to be decrypted and re-encrypted as it passes from device to network and vice versa. And because it is now centrally managed, data can be more easily tracked throughout the enterprise.
BitArmor's approach has already attracted at least one fan in the analyst community. "BitArmor approaches the data protection problem in a unique way, i.e., by embedding protection policies with the data itself, and not by protecting just the devices where data resides," said Jon Oltsik, senior analyst at Enterprise Strategy Group. "I strongly believe this information-centric approach is the future of data protection."
Yet BitArmor's is far from the only approach.
The Encrypt Keeper
In late January, the Trusted Computing Group released final versions of three storage specifications — one designed for PC clients, one designed for data center storage, and one that focuses on interactions between storage devices and underlying SCSI and SATA protocols — that it said would "enable stronger data protection, help organizations comply with increasingly tough regulations and help protect important information from loss and theft."
These new specifications are important, said Robert Thibadeau, the chair of the Trusted Computing Group Storage Work Group and chief technologist at Seagate Technology, because they "give vendors a blueprint for developing self-encrypting storage devices (such as hard drives) that lock data, can be immediately and completely erased, and can be combined with the Trusted Platform Module ... for safekeeping of security credentials."
And already several vendors, among them Seagate, Hitachi and Fujitsu, are busy developing self-encrypting hard drives for both the client and the enterprise.
The advantages of using self-encrypting drives (built around the TCG's Enterprise Security Subsystem Class Specification for data center storage) are manyfold, said Thibadeau. One, the drives can be easily slotted into RAID units or SANs, and also used on the client level in laptops. Two, "the concept provides an enormous increase in the transparency and ease of use around encryption." And three, "by pushing encryption into the drive, you never have to actually manage encryption software or an encrypting controller, and have simplified and greatly reduced the total cost of ownership around putting encryption into the data center."
The drives also solve the problem of protecting data at rest (when drives are unplugged or powered down) as well as the need to safely destroy data.
The Data Destruction Dilemma
On the topic of safely destroying data, a process Thibadeau refers to as cryptographic erase: not only can self-encrypting hard drives make the process easier, they can also make destroying data faster and more cost-effective.
"A lot of data centers have gotten used to full-out destruction of drives," said Thibadeau. "They'll just put the drives into a macerator and grind them up into little particles when they want to decommission a drive. With the cryptographic erase, they don't have to do this. They can repurpose the drive. And the time it takes to erase the drive is on the order of milliseconds, as opposed to a couple or three hours."
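The reason cryptographic erase is so fast is that the drive encrypts every sector under one small media key; destroying that key, a few dozen bytes, renders the entire platter unreadable at once. A toy sketch of the principle, using a throwaway SHA-256 keystream purely for demonstration (an actual self-encrypting drive uses AES in hardware, and nothing here is real drive firmware):

```python
# Toy illustration of cryptographic erase. The keystream cipher below is NOT
# a real or secure cipher -- it exists only to show why destroying the media
# key destroys access to all data encrypted under it.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a keystream derived from key (self-inverse)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

media_key = secrets.token_bytes(32)                    # held inside the drive
sector = keystream_xor(media_key, b"payroll records")  # what hits the platter

media_key = None  # "cryptographic erase": the key is gone in milliseconds
# Without the key, `sector` is just noise; the data cannot be recovered,
# so the drive can be safely repurposed rather than physically shredded.
```

The same property explains Thibadeau's milliseconds-versus-hours comparison: overwriting a full drive takes hours, but zeroing one key is effectively instant.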
But what about the cost of purchasing self-encrypted drives? According to Thibadeau, the costs are relatively small. "If you go to TigerDirect and type in 'Black Armor,' for $60 you can get a 160GB self-encrypting drive from Seagate," he said. "And if you're repurposing drives, you don't have to buy new drives." So right there is a potentially large cost savings.
And for those administrators concerned about a performance hit, Thibadeau said there isn't one. "Unlike most of the other solutions you'll see out there, the I/O speed of the drive is unaffected by whether it's encrypting or not. The drive's just reading and writing just like a normal drive that's not encrypting, though in fact it is." That's the beauty of it, he said. "It acts just like a regular drive, unless a thief gets a hold of it."
And if, by some chance, a thief does get a hold of it? "Without a good cryptographic key that unlocks it, the likelihood would basically be zero" that someone would be able to unlock the data, he said. "It's impossible to do it. Even at Seagate we don't know how you would do it."
The real problem with encryption, at least in the past, Thibadeau said, was the management of it. The more difficult a system or data is to manage, the greater the chance a mistake will be made and information exposed.
"If the world were perfect and everyone was doing everything perfectly, things like software encryption and controller encryption would work," Thibadeau said. But "people make mistakes." And if organizations don't use self-encrypting drives, when somebody swipes a drive from a data center, truck or laptop, the information is exposed. Whereas if the data has been stored on a self-encrypting drive, the information is useless if stolen. It can't be decrypted. The self-encrypting drive makes the problem of properly encrypting and destroying or erasing data go away.
As for availability, vendors are currently only shipping client-level drives, typically with laptops. However, Thibadeau expects that a number of major vendors will be announcing the release or availability of enterprise-level drives in the next three to six months.
This article appears courtesy of EnterpriseStorageForum.com.
Seagate Toughens Security With Encrypted Notebook Drive
http://inet1statisticsinfos.blogspot.com/2009/02/seagate-toughens-security-with.html
Friday, February 20, 2009
Seagate Technology has been promising to deliver a laptop hard drive with integrated hardware encryption for a couple of years, but now the technology is finally hitting the street, the company announced Monday.
Seagate's new Momentus 5400 FDE.2 (full disk encryption) drive uses a government-grade security protocol to encrypt all hard drive data transparently and without user intervention, avoiding the CPU performance hit that comes with most software-based encryption solutions. The laptop drive is aimed squarely at making it easier for enterprises and governments to mandate strong security procedures for sensitive data without the risk of it walking away.
"People have been talking about laptop drive encryption for a long time, but working out the details of the authentication and key encryption side of it has been challenging."
"The approach Seagate has taken from the start is pretty straightforward, with the key encryption built right into the hardware," King continued. "There are drive encryption technologies out there, but they tend to be software-based and you pay a performance penalty for them. Having encryption onboard makes for a very good sense of security from both an ease-of-use and a utility standpoint." ASI will ship its C8015+ model, which will run an 80 GB Momentus FDE.2 drive. The laptop will come with a 2.0 GHz Intel (Nasdaq: INTC) Core 2 Duo Mobile processor and will sport a built-in biometric fingerprint reader. The estimated street price is US$2,150.
Last year, Seagate provided its Momentus 5400 FDE drive to at least one original equipment manufacturer, but the product never went on sale. Seagate isn't saying it went back to the drawing board exactly, with the main change being that it has made the drive even easier to use. The problem with encryption for end users is managing the data: if an end user forgets a password or if a fingerprint reader fails, the drive is basically unusable unless the user's IT department has a key backup plan in place.
Wildman,
Judging from the discussion of the Samsung SSD drives, I'd say it's new.
FM
Must Watch Dell Webinar!
"Managing Data Security in Your Mobile Environment"
Talks about Samsung and their SSD drives and Seagate's FDE with Wave's EMBASSY Suite.
http://dell.awakit-webcasts.com/default/index/watchWebcast/webcast_id/1
Trustco
It's complementary and to be used in data centers with multiple drives/multiple vendors. This isn't for the PC/laptop mkt.
FM
EMC, HP, IBM Team Up On Encryption Key Standard
http://www.enterprisestorageforum.com/continuity/news/article.php/3802771
Seven IT vendors have banded together to propose an encryption key management standard, an important step toward a more comprehensive data security approach (see Storage Security Is More than Just Key Management).
Brocade (NASDAQ: BRCD), HP (NYSE: HPQ), IBM (NYSE: IBM), LSI (NYSE: LSI), EMC's (NYSE: EMC) RSA Security Division, Seagate (NYSE: STX) and Thales (formerly nCipher) said the jointly developed specification for enterprise key management will "dramatically simplify how companies encrypt and safeguard information" and "enable the widespread use of encryption."
The Key Management Interoperability Protocol (KMIP) will be submitted to OASIS (the Organization for the Advancement of Structured Information Standards) for advancement through the organization's open standards process.
"KMIP addresses an important piece of the secure storage puzzle: key management," said Walt Hubis, software architect for LSI's Engenio Storage Group. "Acceptance of the KMIP will eliminate the biggest barrier to the widespread adoption of storage encryption, the fear that encrypted data will be lost. Interoperability across encryption and key management systems is a significant advancement that will enable acceptance of self-encrypting drives (SEDs). There's still work to be done in areas such as network-attached storage, including authentication of users in NAS and storage virtualization environments, but this is a critical step in the right direction."
Enterprise Strategy Group security analyst Jon Oltsik said, "We don't have a standard yet, but this is very encouraging since all of the industry leaders are behind the effort. This should drive further progress in IEEE P1619.3 as well. This group was a bit overwhelmed by the scope of work, but now KMIP limits what they have to focus on to storage devices."
IEEE P1619.3 is the key management part of the IEEE P1619 encryption standards effort.
KMIP was developed by HP, IBM, RSA and Thales, with Brocade, LSI and Seagate joining the effort. All seven companies will now contribute to OASIS for ongoing development of the protocol.
Charles Kolodgy, research director at IDC, said the proposed standard could remove an important barrier to adoption of encryption. "Time and time again, our research shows the primary barrier to the widespread use of encryption is the fear that encrypted data will be lost," he said.
Current key management strategies can be difficult, often requiring manual efforts to generate, distribute, store, expire and rotate encryption keys. KMIP offers a "single, comprehensive protocol for communication between enterprise key management services and encryption systems," the companies said, enabling encryption keys to remain accessible and secure.
The key lifecycle management protocol can be used by legacy and new encryption applications, and supports symmetric keys, asymmetric keys, digital certificates and other "shared secrets." It defines the protocol for encryption client and key management server communication for generation, submission, retrieval and deletion of cryptographic keys.
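The lifecycle operations described above (generation, submission, retrieval and deletion) can be pictured as a thin client-server interface. A hypothetical in-memory sketch follows; the class and method names are invented for illustration, and real KMIP is a tag-length-value wire protocol between clients and key servers, not a Python API:

```python
# Hypothetical stand-in for a KMIP-style key management server, sketching the
# four lifecycle operations the article lists. All names are illustrative.
import secrets
import uuid

class KeyServer:
    def __init__(self):
        self._keys = {}  # key ID -> key material

    def generate(self, bits: int = 256) -> str:
        """Create a symmetric key server-side; return its identifier."""
        kid = str(uuid.uuid4())
        self._keys[kid] = secrets.token_bytes(bits // 8)
        return kid

    def register(self, key: bytes) -> str:
        """Submit client-generated key material for managed storage."""
        kid = str(uuid.uuid4())
        self._keys[kid] = key
        return kid

    def retrieve(self, kid: str) -> bytes:
        """Fetch key material by identifier (e.g. to unlock a drive)."""
        return self._keys[kid]

    def destroy(self, kid: str) -> None:
        """Delete a key, e.g. on expiry or rotation."""
        del self._keys[kid]

server = KeyServer()
kid = server.generate()
print(len(server.retrieve(kid)))  # 32 (a 256-bit key)
server.destroy(kid)
```

The point of standardizing this interface is exactly what the vendors describe: any encryption client, including a self-encrypting drive's management agent, can talk to any vendor's key server, so encrypted data is never stranded by a proprietary key store.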
KMIP is complementary to application-specific standards projects such as IEEE 1619.3 for storage and OASIS EKMI for XML, the companies said.
More information can be found at http://xml.coverpages.org/KMIP/.
Excerpt: To err is human...
http://www.expresscomputeronline.com/20090216/edit01.shtml
Storage encryption is becoming more important in a world where laptop use is growing and laptop theft is a pressing concern. The Trusted Computing Group has published three standards for storage encryption—one for PC hard drives, another for enterprise drives and the third for interoperability with other standards such as SCSI and ATA. Thanks to these hardware-based encryption standards, software encryption is likely to lose ground. The latter is more complicated and must be deployed separately, unlike hardware-based encryption, which comes built into drives that support it. Expect business laptops to offer this feature at a premium in the near term. IT departments will have to figure out how to manage these systems. Enterprise storage will also be affected, though this may take longer.
DOE seeks new approach to cybersecurity
http://www.1105newsletters.com/t.do?id=2159726:11146957
By William Jackson
Feb 12, 2009
Reactive approaches to information security have not kept pace with the rapidly evolving information technology environment, and a panel of experts examining the state of security at the Energy Department has recommended a fundamentally different approach.
The traditional layered wall-and-moat approach to physical security is not well suited to complex information systems whose development and use are unpredictable, the panel concluded in its report, “A Scientific Research and Development Approach to Cyber Security.”
“Today’s cybersecurity methods, policies and tools have proved to be increasingly inadequate when applied to the exponentially growing scope, scale and complexity of information systems and the Internet,” the report states. For instance, the availability of small, powerful USB drives easily circumvents many security measures. “Innovation is needed in many areas — ranging from better authentication protocols to stronger encryption to better understanding of social and human factors.”
The report recommends a program to apply scientific research to the problem, which could enable security to take a leap ahead of emerging threats and vulnerabilities instead of being condemned to a continual game of catch-up.
“Peer-review processes must be used to identify the best research ideas,” the report states. “Opportunities for dissemination of research results — through workshops, conferences, traditional publications or online journals — will be an important consideration in engaging the open science community. Involvement of postdoctoral researchers and students in this effort will help build the pipeline of trained cyber professionals.”
DOE undertook the study because of its heavy reliance on IT and its mission to protect the nation’s energy systems and nuclear stockpiles.
“Despite ubiquitous dependence on electronic information and on networked computing infrastructure, cybersecurity practice and policy [are] largely heuristic, reactive and increasingly cumbersome, struggling to keep pace with rapidly evolving threats,” the report states. “Advancing beyond this reactive posture will require transformation in information system architecture and new capabilities that do not merely solve today’s security challenges — they must render them obsolete.”
A community of cybersecurity professionals and researchers from DOE laboratories, the private sector, academia and other agencies conducted a series of workshops to assess the state of cybersecurity in general and at DOE specifically. “The conclusion reached is that the department should develop a long-term strategy that applies science and mathematics to develop information system architectures and protective measures that go beyond stopping traditional threats to rendering both traditional and new threats harmless,” the report states.
The department sees itself as uniquely qualified to play a leading role in the cybersecurity research and development area because of its reliance on IT infrastructure for a mission that includes classified and unclassified work and basic scientific research.
The panel identified the following three focus areas for research.
Mathematics: Predictive Awareness for Secure Systems. The goal is to examine system and network behavior to anticipate failures or attacks, including real-time detection of anomalous activity.
Information: Self-Protective Data and Software. DOE should create active data systems and protocols to enable self-protective and self-healing systems.
Platforms: Creating Trustworthy Systems from Untrusted Components. DOE should develop techniques for maintaining the integrity and confidentiality of a system comprising components for which there are varying degrees of trust.
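The first focus area, real-time detection of anomalous activity, is often bootstrapped with very simple statistics before anything predictive is attempted. A minimal sketch, assuming a rolling z-score over some metric stream (nothing here is prescribed by the DOE report; the threshold and the login-rate example are illustrative):

```python
# Minimal illustration of "real-time detection of anomalous activity": flag a
# reading that sits far from the running mean of recent readings. A rolling
# z-score is the simplest baseline, chosen here purely for illustration.
from collections import deque
import statistics

def is_anomalous(window, value, threshold=3.0):
    """True if value is more than `threshold` standard deviations from
    the mean of the readings seen so far in `window`."""
    if len(window) < 2:
        return False  # not enough history to judge
    mean = statistics.mean(window)
    stdev = statistics.stdev(window)
    return stdev > 0 and abs(value - mean) / stdev > threshold

window = deque(maxlen=50)                         # recent history
for reading in [10, 11, 9, 10, 12, 10, 11, 95]:  # e.g. logins per minute
    if is_anomalous(window, reading):
        print("anomaly:", reading)               # flags only 95
    window.append(reading)
```

Research programs like the one the report proposes aim well beyond this kind of baseline, toward models that anticipate failures rather than merely flag outliers after the fact.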
About the Author
William Jackson is a senior writer for GCN.
Not sure what this is......
http://java.sys-con.com/node/840922
Before I discuss this month's active JSRs here's a quick addendum to my recent column on security. In my list of security-related JSRs I omitted JSR 321: Trusted Computing API for Java, which plans to develop a Trusted Software Stack for Java providing comparable functionality to that offered by the C-language TSS developed by the Trusted Computing Group. My apologies to the spec lead, Ronald Toegl of the IAIK Graz University of Technology in Austria.
There are several JSRs worth noting this month. (See the Focus on JSRs section on the JCP homepage (http://jcp.org/en/home/index) or subscribe to our mailing list for full details.)
First, congratulations to SK Telecom, the spec lead for JSR 298: Telematics API for Java ME, which made its final release in October. This JSR defines Java APIs to enable mobile devices to access and control various devices in cars. For example, you could set the climate-controls, unlock the doors, or control the anti-theft system from your phone. Before too long, every device and every machine will be connected!
Another Java ME JSR, first released in 2006, is JSR 256: Mobile Sensor API. Led by Nokia and now in its third Maintenance Review, this JSR defines APIs for managing sensors embedded within mobile devices. Many cellphones, for example, contain sensors for reading the battery charge level or the network field intensity. Some, following the lead of Nokia who were the first to do so, include an accelerometer that can be used to sense the orientation of the screen (as in the iPhone), to enable certain phone or application functions to be controlled by gesture, or even to control a radio-controlled model car.
There have also been several developments in the Java EE space this month. Firstly, JSR 243: Data Objects 2.0, led by Sun Microsystems, reached its Maintenance Release 2. JDO defines a standardized API to enable the persistence of plain old java objects (POJOs) to any type of data store. Version 1 of the specification was defined by JSR 12 back in 2002. Starting with version 2.0 the JSR has been developed and maintained as an Apache project, and the Reference Implementation (RI) and the compatibility test suite (the TCK) have been developed collaboratively and released as open source. (This is therefore yet another example of an "open and transparent" JSR.)
JSR-314: JavaServer Faces 2.0, led by Star Spec Lead Ed Burns and his colleague Roger Kitain from Sun Microsystems, entered its second Early Draft Review. As I reported in an earlier column, this JSR is also being run in a very open manner, with significant public participation. The JSR aims to greatly simplify the process of building user interfaces for Java server applications, enabling developers to quickly build web applications by assembling reusable UI components in a page, connecting these components to an application data source, and wiring client-generated events to server-side event handlers. If you're interested, join them at java.net.
Last but by no means least, this month also saw an Early Draft Review of JSR 316, the specification for the next version of the Java EE platform (Java EE 6). Led by Roberto Chinnici and Bill Shannon from Sun Microsystems, this JSR will focus on making the platform more modular (by defining profiles, and supporting "extensibility"), will prune some older technologies that are no longer widely used, and will continue to promote the ease of development theme that was started in Java EE 5. You can find a useful summary of the scope of the JSR in this article.
By the way, if you're wondering what's happening in the Java SE world, check out the OpenJDK project at java.net. The developers are trying a different approach - working out the new ideas through open-source projects first, before later submitting them as JSRs. I'm sure they'll welcome you if you'd like to participate.
Finally, talking of participating, once again I'd like to remind you to vote in our elections. (If you're reading this online you probably still have time.) Please visit our election website to see the full list of candidates, to read their campaign materials, and to cast your vote. We need everyone's input, and the representatives you choose for the Executive Committees will help to determine the kind of organization we become. You are the Community in the Java Community Process.
Obama orders cybersecurity review
http://gcn.com/articles/2009/02/11/obama-begins-cybersecurity-review.aspx?s=gcndaily_120209
President Barack Obama has directed his security advisers to conduct an immediate review of the government’s cybersecurity plan, programs and activities.
Administration officials said that the interagency review would develop a “strategic framework” to ensure that the government’s cybersecurity efforts are integrated and coordinated with Congress and the private sector.
The review will take 60 days and will be done by the president’s homeland security and national security advisers. It will be led by Melissa Hathaway, who has served as senior adviser and cyber coordination executive to the Director of National Intelligence and has played a leading role in coordinating the government’s Comprehensive National Cybersecurity Initiative.
Hathaway will hold the post of acting senior director for cyberspace for the National Security and Homeland Security Councils during the review, officials said.
“The national security and economic health of the United States depend on the security, stability and integrity of our nation’s cyberspace, both in the public and private sectors," said John Brennan, assistant to the president for counterterrorism and homeland security, in a statement.
A report released in December by a panel of cybersecurity experts and lawmakers recommended Obama create a new cybersecurity directorate in the National Security Council (NSC) to develop and manage a comprehensive national security strategy for cyberspace. The panel, the Center for Strategic and International Studies’ Commission on Cyber Security for the 44th Presidency, also suggested using the United States’ approach to nuclear nonproliferation as a model for cybersecurity efforts.
James Jones, Obama’s national security adviser, told European leaders at a security conference in Munich on Feb. 8 that the NSC is evaluating how to update its “capacity to combat the proliferation of weapons of mass destruction while also placing a far higher priority on cybersecurity,” according to a transcript of his speech.
The Obama administration’s homeland security platform, released in January, calls for a national cyber adviser that would coordinate federal cybersecurity efforts and report directly to the president.
Enabling drive-level encryption is only the beginning
http://blogs.techrepublic.com.com/security/?p=792
Date: February 11th, 2009
Author: Tom Olzak
The TCG released a new standard for encryption at drive-level. Sounds good, but how does this actually impact enterprise data encryption efforts, particularly pre-boot authentication?
——————————————————————————————————————-
Last week, the Trusted Computing Group (TCG) made an announcement which caused a burst of articles about disk encryption. The TCG released a new standard for encryption at drive-level. Sounds good, but how does this actually impact enterprise data encryption efforts, particularly pre-boot authentication (PBA)?
Encryption at the hardware/firmware level is not a new concept. Seagate, for example, announced a product called DriveTrust in 2006, which uses proprietary technology to encrypt its drives. DriveTrust uses AES or other standard encryption algorithms. So why the hubbub about the TCG standard?
The TCG standard
The members of the TCG Storage Work Group decided that not having a standard drive-level encryption method was a bad idea. So, they created a set of security standards covering a wide range of storage technologies, including ATA, Serial ATA, SCSI, FibreChannel, USB Storage, IEEE 1394, Network Attached Storage (TCP/IP), and iSCSI. And the standard is not limited to magnetic storage. According to the TCG, storage systems include disk drives, removable media drives, flash storage, and multiple storage device systems.
The set of security standards augments the TCG Storage Architecture Core Specification and includes three subsystem classes: end-user device (Opal), enterprise, and interface interactions. According to a January 27, 2009 TCG press release:
The Opal specification outlines minimum requirements for storage devices used in the PC client and enterprise markets. It outlines for vendors required and optional TCG capabilities and it specifies how to activate and customize the trusted storage device. Some vendors are starting to ship products based on the OPAL specification and have demonstrated how these are interoperable with those from other vendors.
The Enterprise Security Subsystem Class Specification extends the concepts of trusted storage devices to those used in data centers and high-volume applications, where typically there is a minimum security configuration at installation, a requirement to bring devices on-line quickly and the need for high performance with low overhead. The specification defines encryption of data on media and enables support for strong access control to support organizational security.
Finally, the Storage Interface Interactions Specification specifies how the TCG’s existing Storage Core Specification and the other specifications interact with other specifications and standards for storage interfaces and transports. For example, the specification supports a number of transports, including ATA parallel and serial, SCSI SAS, Fibre Channel and ATAPI. It was developed with input from representatives of those organizations. Importantly, it enables interoperability of trusted drives in legacy environments.
Documentation for each of these standards, including FAQs, is available for download at the TCG Web site.
This all sounds good: standardized drive-level encryption. Administrators can encrypt end-user devices before they are placed into users’ hands. Future drives using the standard will enable data center engineers to implement server and disk-array encryption on the devices themselves, regardless of the hardware manufacturer. In both cases, end-user devices and data center systems, performance overhead due to encryption will be minimized. Finally, no centralized encryption management system is necessary… well, not so fast.
The specification does not include centralized management of an encryption solution. This will fall to encryption vendors to develop. However, the TCG provides interfaces to allow development of management software which will work across all types of drives. So this might not be a problem, assuming software development keeps pace with the release schedule of new drive technology.
Why centralized management is important
The biggest issue I face when managing an end-user device encryption environment is lost passwords. Passwords are lost when users have a lapse of memory or when disgruntled employees refuse to provide them as they’re being escorted to the exit. The standard does allow for multi-user access via different passwords, so an organization can deploy devices with both a user account and an account reserved for administrators. Sounds good on the surface, but there are issues when encryption passwords and keys are not managed from a central server.
If a user forgets his or her password, it can’t be reset without an admin having access to the device. To avoid this issue, organizations would have to maintain a secure disk-password repository, assuming each user has a different password. Not giving each user a different password means that every system has to be “touched” whenever an employee leaves the company.
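One common way around maintaining a per-device password list is to derive each device's recovery password from a single escrowed master secret. The sketch below is a minimal illustration of that idea; the device IDs, HMAC-based derivation scheme, and password length are assumptions for the example, not anything defined by the TCG specifications.

```python
import hashlib
import hmac

def device_recovery_password(master_secret: bytes, device_id: str) -> str:
    """Derive a per-device recovery password from one escrowed master secret.

    The help desk keeps a single master secret in escrow and re-derives any
    device's recovery password on demand, so no per-device password list
    needs to be stored or synchronized.
    """
    digest = hmac.new(master_secret, device_id.encode(), hashlib.sha256).hexdigest()
    return digest[:20]  # truncated to a manageable length for manual entry

# Two devices, two distinct recovery passwords, one secret to protect.
master = b"secret-kept-in-the-safe"
pw_a = device_recovery_password(master, "LAPTOP-0042")
pw_b = device_recovery_password(master, "LAPTOP-0043")
```

Of course, compromise of the escrowed master secret exposes every derived password at once, so it needs the strongest protection of all.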
If the administrator password is released to unauthorized personnel–we all know this never happens–every user or data center drive with that password is potentially vulnerable.
Regardless of whether PBA is controlled at the drive level or by software loaded into the master boot record, the protection provided is only as good as the authentication method used. In other words, weak passwords or password policies result in weak encryption. So in addition to using strong passwords, security policies (e.g., maximum login attempts) must also be enforced.
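As a rough illustration of why a lockout policy matters, here is a toy model of pre-boot authentication with a maximum-attempts counter. The class and its limit of five tries are purely illustrative assumptions; real self-encrypting drives enforce retry limits in firmware.

```python
class PreBootAuth:
    """Toy pre-boot authentication gate with a lockout policy.

    After MAX_ATTEMPTS consecutive failures the device refuses further
    tries, which blunts simple password-guessing attacks.
    """
    MAX_ATTEMPTS = 5

    def __init__(self, password: str):
        self._password = password
        self._failures = 0
        self.locked = False

    def unlock(self, attempt: str) -> bool:
        if self.locked:
            return False  # locked devices ignore even the correct password
        if attempt == self._password:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.locked = True
        return False

pba = PreBootAuth("correct-horse-battery")
guesses = [pba.unlock("123456") for _ in range(5)]
```

After five wrong guesses the device locks, and even the correct password is refused until an administrative reset.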
Organizations can implement other authentication methods to enable PBA. Biometrics is a good password replacement if multi-factor authentication isn’t necessary. Yet fingers and sensors don’t always cooperate. There are times when entering a password is the only login option, unless your company is happy to give users the day off at its expense.
The bottom line? Drive-level encryption is conceptually a great idea, but it is only one component of a successful enterprise encryption strategy. Without supporting management technology, encryption can cause more problems than it solves. And just because it is available and easy to turn on doesn’t mean you should enable it. Wisely choose when and what to encrypt. Make sure you understand and have planned for all the challenges faced when rolling out encryption. Otherwise, you might have a second chance to get it right at a new place of employment.
40% of hard drives bought on eBay hold personal, corporate data
Buyers found data on everything from corporate spreadsheets to e-mails and photos
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9127717&source=NLT_SEC
Lucas Mearian
February 10, 2009 (Computerworld) A New York computer forensics firm found that 40% of the hard disk drives it recently purchased in bulk orders on eBay contained personal, private and sensitive information -- everything from corporate financial data to the Web-surfing history and downloads of a man with a foot fetish.
Kessler International conducted the study over a six-month period, buying up disk drives ranging in size from 40GB to 300GB from the United States and Canada. The firm, which completed its research about two weeks ago, bought a total of 100 relatively modern drives, the vast majority of them Serial ATA.
"With size of the sample, I guess we were surprised with the percentage of disks that we found data on," said Michael Kessler, CEO of Kessler International. "We expected most of the drives to be wiped -- to find one or two disks with data. But 40 drives out of 100 is a lot."
Kessler believes the drives were likely from computers sold to third-party resellers that disassembled them and sold off the parts.
Kessler's engineers had to use special forensics software to retrieve data from some of the hard drives, but other drives contained sensitive data in the clear, having never been overwritten or erased. The data included personal documents, financial information, e-mails, DNS server information and photographs.
"The average person who knows anything about computers could plug in these disks and just go surfing," Kessler said. "I know they found a guy's foot fetish on one disk. He'd been downloading loads and loads of stuff on feet. With what we got on that disk -- his name, address and all of his contacts -- it would have been extremely embarrassing if we were somebody who wanted to blackmail him."
Kessler said his company specifically avoided buying drives whose sellers indicated that the drives had been erased.
Kessler International offered this breakdown of the kind of data it retrieved: personal and confidential documents, including financial information, 36%; e-mails, 21%; photos, 13%; corporate documents, 11%; Web browsing histories, 11%; DNS server information, 4%; miscellaneous data, 4%.
"We were more concerned with searching for people's identification, which is what we found, but we were surprised by all the corporate spreadsheets and business finance records we found," Kessler said.
The forensics firm even found one company's "secret" recipe for French fries, Kessler said.
In recent years, hard drives have shown up on eBay that contain all kinds of sensitive data. In April 2006, Idaho Power Co. learned that drives it thought had been recycled had actually been sold on eBay with the data still intact. The Boise, Idaho-based utility had used the drives in servers; when bought on eBay, the drives still contained proprietary corporate information such as memos, customer correspondence and confidential employee information.
And in 2007, a supposedly new hard drive purchased on eBay was found to contain information from the Arkansas Democratic Party.
Charles Kolodgy, an analyst with research firm IDC in Framingham, Mass., said drives from PCs are most easily protected even after resale by using a full-disk encryption (FDE) product, but he said that prior to selling an old machine, users should still format the drive and use overwrite tools just to be sure. "But if you have FDE you don't need to be as concerned if something falls through the cracks," he said. Larger hard drives should be erased using industrial degaussers. As for the drives Kessler purchased from eBay, the company plans to erase the data with a U.S. Department of Defense-grade degausser and then either throw out the drives or reuse the models with sufficient capacity.
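For drives that lack built-in encryption, software overwriting is the usual fallback before disposal. The snippet below is a minimal sketch of a multi-pass overwrite of a single file; note that on SSDs and other wear-leveled media, overwriting in place is unreliable, which is one reason degaussing or crypto-erase is preferred for whole drives.

```python
import os
import secrets
import tempfile

def overwrite_file(path: str, passes: int = 3) -> None:
    """Overwrite a file in place with random bytes, then delete it.

    A software analogue of the wipe-before-resale advice above; the pass
    count of 3 is an arbitrary choice for the example.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())
    os.remove(path)

# Demo on a scratch file.
fd, demo_path = tempfile.mkstemp()
os.write(fd, b"customer correspondence")
os.close(fd)
overwrite_file(demo_path)
wiped = not os.path.exists(demo_path)
```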
IBM comes up with raft of new services
IBM has announced an initiative that it claims will unite the digital and physical worlds, creating the "building blocks for 21st century infrastructure".
IBM was addressing the global proliferation of data, said IBM's Nick Drabble. "There's been an explosion of data, a growing number of data-aware devices all with intelligence that we can use, but don't."
He pointed out that much of the information was contained in isolation. "Everything's interconnected now," Drabble said. "But there are a lot of inefficiencies in the way we work. For example, at any one time, 85 percent of the world's computers are lying idle. By using only 15 percent of capacity, we're driving huge inefficiencies as well as heavy use of power for cooling."
He also pointed out that companies spent about 75 percent of their resources just maintaining systems rather than buying new products and services to develop their businesses. "We should be helping companies to use their assets in a smarter way," he said.
Drabble said that the new initiative would offer software across a variety of industries. One of the new products on offer, IBM Service Management Industry Solutions, will be customised for a range of different industries and will bring together software from a variety of IBM divisions. Drabble said that this co-operation was key, as IBM had experience in such a vast array of fields that it could bring that experience to bear.
IBM is also introducing a new governance consulting practice to mitigate global risk and a new consultancy to help enterprises design and implement service management strategies.
Another part of the announcement is Tivoli Service Automation Manager software, which automates the design, deployment and management of services such as middleware, applications, hardware and networks, tasks that today are largely done manually.
"There's a great cost associated with staffing," said Drabble. "I'm not saying that we should be getting rid of thousands of workers, but those people could then be deployed on new projects."
A key part of the new initiative is security; the company has announced full-disk encryption on its IBM System Storage DS8000, to improve security through self-encrypting drives. Drabble said that the combination of the DS8000 and the Tivoli Key Lifecycle Manager would greatly improve security at a time when enterprises were focused on the cost of data breaches.
Other aspects of the new initiative include new data security services and software for IBM's system z mainframe.
"It's an ambitious initiative," agreed Drabble, but he said that companies were not necessarily going to take every offering. "This is aimed at big enterprises and SMEs, and they can take what they want. There's something here for everyone," he added.
Keeping stored data safe within company walls
Storage professionals protect data with encryption and key management
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=332017
Stacy Collett, February 9, 2009 (Computerworld)
BECU, Washington state's largest credit union, used to keep its stored data locked down using an appliance to encrypt data before it was stored to tape. But when it had the opportunity to upgrade storage equipment, the company chose a simpler, cheaper and perhaps more secure option -- an application that encrypts tapes in the tape library.
The appliance "was the best solution at the time," says Kathryn Antonetti, IT systems and security manager at Tukwila-based BECU, a not-for-profit financial cooperative with assets of more than $8.5 billion. "Now encryption is being offered at virtually every layer." The switch eliminated maintenance and training costs for the appliance, and other headaches. "I had [three vendors] pointing fingers at each other" when the system had problems, she adds.
Protecting stored information is the next wave in data security. "We're starting to see more emphasis on data at rest," says Robert Rosen, former president of IBM user group Share and CIO at the National Institute of Arthritis and Musculoskeletal and Skin Diseases in Bethesda, Md. "It's kind of a no-brainer. If you've done it, your [data is] protected and you don't have to worry about it."
As companies upgrade their storage equipment, many are taking advantage of technological advances such as tape drive encryption, tape library encryption and enhancements in the way encryption keys are managed. There has also been progress in adopting the disk and tape encryption specifications of the IEEE P1619 standard, says James Damoulakis, chief technology officer at storage services provider GlassHouse Technologies Inc. "Still, it's fair to say that [storage security] has lost some momentum" because of policy and process limitations, says Damoulakis, who is a Computerworld columnist.
"There's a feeling that [data in storage] is a locked door -- so it's not a high priority," Rosen says. "But I think that's ultimately going to change with the turnover of equipment."
"Unfortunately, most companies wait until the problem exists before fixing it," says Ari Kaplan, a senior consultant at Datalink Corp. in Chanhassen, Minn., and former president of the Independent Oracle Users Group.
With data security breaches now costing companies $202 per compromised record, according to the Ponemon Institute, it's time to start locking down data at rest. Here are three techniques for protecting stored data.
Encryption
Gartner Inc. has found that companies that encrypt stored data do so because they have to, not because they want to. "There are regulatory compliance pressures -- PCI or HIPAA," says Gartner analyst Eric Ouellet, referring to the Payment Card Industry Data Security Standard and the Health Insurance Portability and Accountability Act. "Or it's the fear that the tape will fall off the back of the truck and you'll have a disclosure issue."
Multimaster Keys
The potential for security leaks inside a company is often overlooked, industry watchers warn. "A lot of time is focused on outside intruders, but where I see the gap is companies aren't really protecting themselves against inside threats," Kaplan says. "It would be easy for disgruntled employees to get secure information from inside their company." In these cases, encryption may not be enough if the culprit is the employee holding the encryption key.
Multimaster encryption keys offer one way to plug the gap. With multimaster keys, "even if you're the DBA and know all the passwords, you still cannot retrieve the data. Only the person who manages these multimaster keys, like the CFO," has that authority, says Kaplan.
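The simplest form of the multimaster idea is secret splitting: the master key is divided into shares, and every share is required to reconstruct it, so no single officer can recover data alone. The XOR-based split below is a minimal sketch of the principle; commercial products typically use richer schemes such as Shamir secret sharing, which can tolerate lost shares.

```python
import secrets

def split_key(key: bytes, holders: int = 2):
    """Split a key into `holders` shares; all shares XOR back to the key.

    Any subset short of the full set looks like random noise, so no
    individual share holder learns anything about the key.
    """
    shares = [secrets.token_bytes(len(key)) for _ in range(holders - 1)]
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_key(shares):
    """Recombine all shares to recover the original key."""
    key = shares[0]
    for share in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key

# e.g. one share each to the DBA, the CFO, and the security officer
master_key = secrets.token_bytes(32)
shares = split_key(master_key, 3)
recovered = combine_key(shares)
```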
Several companies offer multimaster key solutions. Oracle's Datavault, for instance, places data into a virtual lock box. Once it goes through that area of the database, not even an admin can access it. And for compliance purposes, it ensures that data doesn't change.
Other technologies from vendors such as NetApp and Oracle keep audit trails from being altered.
Audit trails trace what information database administrators select and update as they set up a database. Of course, DBAs who are up to no good know how to cover their tracks. The newer technologies are designed to thwart such activity by preventing anyone from modifying audit trails.
What's more, most encryption systems can get pricey. "When you're looking at the cost associated with this, whether it's the time to deploy or the amount of [labor] or the actual cost in dollars of the solution -- these things are not cheap," Ouellet adds.
A less expensive way to add encryption is to use the capabilities that come built into many applications, Ouellet advises. "You'll have to pay for it, but it's needed, and as far as integration is concerned, it's not going to take an inordinate amount of time," he says.
Looking for an ultracheap approach? Ouellet suggests buying a hard drive with built-in encryption. Seagate, Toshiba and Hitachi are among the vendors introducing self-encrypting drives. "It costs only a few bucks more to buy a drive with encryption," Ouellet says. "The applications aren't even aware there's any encryption. It's all in the background at the low-level driver level."
But keep in mind that self-encrypting drives address only storage issues, Ouellet warns. "As far as the application is concerned, once it reads the data off the drive, it's in clear text -- and in a backup, it's in clear text," he says. "Only in the storage environment is it safe."
On the bright side, self-encrypting drives will be helpful down the road when you have to dispose of a drive, Ouellet adds. "I can just lose or dispose of the key that was on that drive. Then the data is gone."
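The disposal trick Ouellet describes is often called crypto-erase: since the drive only ever stores ciphertext, destroying the media key renders everything unreadable at once. The sketch below illustrates the principle with a toy SHA-256 counter-mode keystream; real self-encrypting drives use hardware AES, and this homemade cipher should never be used for actual data protection.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256 counter-mode keystream (illustration only)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

media_key = secrets.token_bytes(32)
plaintext = b"confidential customer records"
on_disk = keystream_xor(media_key, plaintext)   # what the platters hold
readable = keystream_xor(media_key, on_disk)    # a normal read via the key
# Crypto-erase: destroy the drive's only copy of media_key and on_disk
# becomes irrecoverable noise; no overwriting of the data itself is needed.
```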
On the Desktop
Data at rest now includes data on the desktop. The NIH's IT department is moving to desktop-level encryption. "Unfortunately, thefts occur inside, too," Rosen says. "Encryption is a fairly simple mechanism. The performance impact is minimal."
Children's Hospital Boston also encrypts data on the desktop says Paul Scheib, director of operations and chief information security officer. "We do laptop encryption, and we try to limit what data can be stored on local machines," he says. "We don't have a sure way to stop people from writing from a CD drive, because they do have a business need to do it. The best you can do is put policies in place and educate people."
But desktop encryption resolves only one security issue, Ouellet says. "A lot of organizations have an onion-layer approach. To be able to get onto the storage environment, you have to go through a bunch of gates and barriers," such as ID management and network firewalls, he says. "That may, in fact, be good enough -- it solves the external data problem. But your storage environment is not addressed that way."
Key Management
For years, encryption users have been calling on security and storage vendors to offer better interoperability when it comes to managing the keys that actually control the encryption. In response, companies such as Microsoft Corp. now allow users to store the encryption keys for data held on other vendors' key management systems.
But key management will become more complex, experts say, as encryption finds its way into more and more storage devices, creating an avalanche of keys to manage.
Some industry standards are being developed, such as IEEE P1619, but they address tape encryption and not the storage environment. "We're seeing that move over to the self-encrypting drive [systems], but as far as the databases are concerned, they don't quite have a standard," says Ouellet.
For now, companies such as IBM and RSA Security Inc. provide some form of key management for external services, Ouellet says.
Industry watchers say that although companies aren't clamoring for encryption and storage security, adoption will remain steady. "There's a finite amount of resources available," Rosen says. "There won't be a huge rush to it -- but with [new hardware], everything is going to be encrypted."
Collett is a Computerworld contributing writer. Contact her at stcollett@aol.com.
The IETF extends its timeframe for NAC standards
http://www.networkworld.com/newsletters/vpn/2009/020909nac1.html
Network Access Control Alert By Tim Greene , Network World , 02/10/2009
The last newsletter cited in error some milestones for completion of Internet Engineering Task Force NAC standards as if they were the actual timetable.
But rather than representing the current schedule for completion of the work, the milestones were goals set last year that have been missed, pushing back the date for reasonably expecting the standards by six months or so.
However, according to Steve Hanna, co-chair of the IETF Network Endpoint Assessment working group, the delay was caused in part because the group took the time to include at least three significant improvements to the proposed standards.
One enables NAC servers to retry handshaking with endpoints that have already been admitted to the network. The server might want to do so to recheck the state of endpoints if NAC policies change or if endpoints exhibit behavior that indicates they might have fallen out of compliance since they were admitted, Hanna says.
The task force liked the idea, but it took several tries to arrive at a proposal that wasn’t overly complex and that ensured the NAC state engine wouldn’t be susceptible to inconsistencies when the server retried endpoints, he says.
The second makes the mechanism by which endpoints pass along their health status more efficient. The initial proposal for the protocol was based on the Trusted Computing Group’s Trusted Network Connect standard. Some on the task force favored Microsoft’s version because it sent more compact messages, and so was more efficient.
The drawback of the Microsoft method is that it sends only one health message, thereby limiting the amount of information about the endpoint that can be passed along. The TNC version was more verbose, but could be extended to allow as many exchanges as needed.
A compromise version has been worked out that uses the brevity of Microsoft’s method in combination with greater extensibility to allow multiple exchanges, Hanna says.
The third improvement to the standard involves how third party software interacts with NAC. There is no standard for how, say, an antivirus vendor’s endpoint software communicates with NAC. For that to happen requires the network to also have the server-side antivirus software, Hanna says.
Individual NAC and antivirus vendors might work out ways to pass the information without the server software, but that was done case-by-case.
The current proposal includes standards for how about 15 endpoint attributes should be formatted so any NAC product that adheres to the standard can glean the information. The upshot is that if a new type of device – an iPhone, for instance – were to try to join a network and it adhered to the standard, it could be queried by a NAC server.
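To see why standardized attribute formatting helps, consider a posture report serialized in a common format that any conforming NAC server can parse, no matter which vendor's agent produced it. The attribute names and JSON encoding below are illustrative assumptions for the sketch; the actual IETF work defines its own attribute formats.

```python
import json

# Hypothetical endpoint posture attributes, loosely modeled on the kinds
# of fields described above (the names are invented for this example).
posture = {
    "product-information": {"vendor": "ExampleAV", "version": "9.1.0"},
    "operational-status": "running",
    "last-signature-update": "2009-02-01T00:00:00Z",
    "firewall-enabled": True,
}

def encode_posture(attrs: dict) -> bytes:
    """Serialize posture attributes deterministically for the wire."""
    return json.dumps(attrs, sort_keys=True).encode()

def decode_posture(wire: bytes) -> dict:
    """Parse a posture report regardless of which agent produced it."""
    return json.loads(wire.decode())

wire = encode_posture(posture)
```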
So the standards work on NAC continues beyond the timeframe the task force initially hoped for, but the result is a better proposal.
An FPGA home for device authentication?
http://www.edn.com/blog/890000689/post/1330040333.html
There are a variety of ASSPs that Verayo is developing, once the concept of challenge-response authentication gains traction. The concept might work well with the Trusted Computing Group's Trusted Platform Module, for example. If customers respond to the PUF architecture, FPGA vendors may be more willing to work with Verayo on hard macros. But in the meantime, Verayo can offer its security technology as a soft implementation method for existing FPGAs. This is interesting from a crypto and security perspective, but also from the perspective of developing new marketing channels for macros in the FPGA world.
ISACA applauds move to common disk encryption standard
http://www.snseurope.com/snslink/news/news-full.php?id=11063
Date: Friday 6 February 2009
Author: PHILIP ALSOP
ISACA has applauded moves by the data storage industry to develop a common encryption standard for use on hard drives. According to Vernon Poole, CISM, Head of Business Consultancy for Sapphire and member of ISACA's Information Security Management Committee, the development of the standard by the Trusted Computing Group - whose membership includes Fujitsu, Hitachi, IBM, Samsung, Seagate, Toshiba and Western Digital - centres around three non-proprietary specifications.
"The Opal Security Subsystem Class Specification is designed for PC clients, the Enterprise Security Subsystem Class Specification is for datacentre storage, while the Storage Interface Interactions Specification focuses on the interactions between these storage devices and underlying SCSI/ATA protocols," he said.
"These three specifications come together to form a security framework that the data storage industry can use on their drives, and so allow notebook, as well as desktop, PC users to encrypt their data on-the-fly as it is written to the drive," he added.
As data is required, he went on to say, it can be decrypted directly into the computer's memory, so lessening the risk that the data will fall into the wrong hands.
"The fact that the industry has developed these specifications under the auspices of the Trusted Computing Group, is extremely positive for all aspects of the IT security industry, since it will allow companies to upgrade their computers and have a baseline on which to build an enforceable set of IT security policies," he said.
"Research from the Privacy Rights Clearinghouse (http://www.privacyrights.org) shows that, in recent years, more than 252 million records containing sensitive data have been compromised due to security breaches in the US alone. The use of encrypted hard drives would have greatly reduced this figure," he added.
Of laptop data security
Done the basics - or sleepwalking along a precipice?
http://www.theregister.co.uk/2009/02/06/laptop_data_security/
Perhaps the first thing to say is: “It's nobody's fault.” We could blame the laws of physics for the current capabilities of laptops, but not those who discovered them, nor those who have successfully pushed the data storage of hard disks to terabyte capacities. Nor indeed, the people who squeezed the processing equivalent of several mainframes into the flat rectangle of electronic wizardry that we give to our mobile workers.
The downside, of course, of being able to carry the equivalent of several million copies of Encyclopedia Britannica in a briefcase, is that we can now lose, corrupt or inadvertently reveal vast quantities of information, whereas before we could only do so for relatively small quantities. It is like living in a palace after living in a shed - but of course, the shed had one door and a single room to maintain.
All the same, the risks associated with storing data on a laptop remain relatively straightforward to define. First, any piece of information will have an associated value, be it a laundry list or the recipe for Coca Cola – it only takes one slip of paper to be left on a photocopier to find out the difference. Similarly, a single spreadsheet may contain cricket scores, or indeed the pricing structures offered to different customers.
The scale of today's laptops give us increased risk – it is now far easier to store a great deal more information than before. A terabyte could easily equate to the entire repository of information for many businesses for example, and with that much space available, it is tempting to store as much as possible. This does increase the risk of having high-value information in the mix, which also raises the bar in terms of protection needed.
CIA
We can consider the threats in terms of the acronym CIA - that's:
Confidentiality - that only those who should see the information, can see the information.
Integrity - that the information cannot be changed without authorisation or knowledge.
Availability - that the information is protected against loss.
For laptop users, there are some relatively straightforward mechanisms that can be implemented to reduce the risks of each.
Both confidentiality and integrity need to be dealt with in a number of ways. The first is to ensure the information itself is protected. By far the simplest mechanism is to ensure the laptop is password protected - either at login time or, for the more security conscious, in the BIOS.
But this does not protect against someone removing the hard drive. To protect against this, most current laptop operating systems have some kind of hard disk encryption mechanism built in – there’s Bit Locker for Windows Vista, for example.
Also, the Trusted Computing Group has just announced a specification for direct hard drive and USB stick encryption, which should help things even more.
Trainspotting
You don't have to be an expert to extract information from a laptop, if the person in front of it insists on showing it to all and sundry. On trains, in planes and in cafés, there have been countless, quite flabbergasting occurrences of business executives showing off their corporate secrets, in spreadsheets or slide decks.
It would be funny if it wasn’t so frequent – perhaps it is the ultimate demonstration of the belief that security breaches only happen to other people (the best example I can remember was a loud-mouthed senior exec of a systems integrator explaining to a colleague – and indeed the rest of the carriage – how to interpret next year’s competitive analysis spreadsheet).
It's not just the 'data at rest' that needs protecting, but also 'data in motion' - as we describe in another article for example, rogue Wi-Fi hotspots can be capturing information from unsuspecting users. Surprisingly perhaps, individuals do not always use the basic protections available to them - using secure channels to access their email servers, for example. For larger organisations, SSL VPN is another mechanism to protect against this threat – not only do such encrypted links give secure access to corporate systems, but this also means mobile workers will be using corporate protections when they access the Internet.
Data leakage protection (DLP) deserves a mention here, as a technology which will monitor what's being sent through a corporate firewall and block anything that looks suspect. We need to remember that security breaches can be as much down to stupidity as malice, and also that a laptop user may well be accessing the internet directly rather than via a VPN.
An information leak may be quite simply a case of attaching the wrong file to an email, or sending it to the wrong person - who hasn't inadvertently used the 'autocomplete' feature in their email client, and sent a document off to the wrong 'Sarah' or the wrong 'Graham'? As individuals we should question how much we need such features in the first place, and whether they are worth the risk.
And let's not forget the ever-so-obvious topics of anti-virus, personal firewalls and so on. Just because there isn't currently a big scandal about computer worms hacking information off hard drives and posting it on the Internet, that's probably only because the hackers haven't got around to it yet. Your McAfee or Norton may be up to date, but when did you last patch your operating system and applications?
Lastly, we have availability. This can be dressed up in all kinds of ways but in its most simple form it equates to being confident that the information we had yesterday will still be there tomorrow. The laptop’s biggest strength is also its greatest weakness in this respect - that of portability. It is quite possible to lose every last bit of information one has, just by leaving it on the bus. Equally, only the most resilient of laptops can resist the effects of knocking a glass of water over the keyboard; note also that most hard drives are mechanical - marvels maybe, but prone to damage.
The answer is backup - which can be as straightforward as taking a copy of important data on a USB stick and stowing it somewhere sensible (USB sticks can be a solution as well as a problem – but see confidentiality, above). Mobile workers don't always have access to corporate systems, which means they are not always going to be supported by corporate backup mechanisms; equally, offline access can result in storing more information than strictly necessary on the local drive. Using a laptop without doing personal backups is like driving without a safety belt, in the vain hope that accidents only happen to other people.
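A personal backup routine needs only a copy plus a way to verify it later. The sketch below copies a file to a backup directory and records a SHA-256 checksum alongside it; the directory layout and the `.sha256` sidecar convention are assumptions made for the example.

```python
import hashlib
import os
import shutil
import tempfile

def backup_with_checksum(src: str, dst_dir: str) -> str:
    """Copy `src` into `dst_dir` and write a SHA-256 sidecar file so a
    later restore can be verified; returns the path of the backup copy."""
    os.makedirs(dst_dir, exist_ok=True)
    dst = os.path.join(dst_dir, os.path.basename(src))
    shutil.copy2(src, dst)  # copy2 preserves timestamps as well as contents
    with open(src, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    with open(dst + ".sha256", "w") as f:
        f.write(digest)
    return dst

# Demo with a scratch "USB stick" directory.
work = tempfile.mkdtemp()
src = os.path.join(work, "notes.txt")
with open(src, "w") as f:
    f.write("quarterly figures")
dst = backup_with_checksum(src, os.path.join(work, "usb_stick"))
with open(dst, "rb") as f:
    verified = hashlib.sha256(f.read()).hexdigest() == open(dst + ".sha256").read()
```

Remember the caveat above: an unencrypted backup on a USB stick trades an availability risk for a confidentiality one.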
In conclusion then, there may be corporate mechanisms in place for data security, but these do not always extend out to laptop users. There’s always more that we can do, but those who do not follow the basics are sleepwalking along a precipice.
IETF standards for NAC in the final stages
The two NAC protocols up for approval: posture assessment protocol and posture broker protocol
http://www.networkworld.com/newsletters/vpn/2009/020209nac2.html?hpg1=bn
Internet Engineering Task Force standards for NAC are in their final stages, according to IETF postings.
A working group of the IETF has been considering standards for two protocols needed to support interoperability between agents that gather endpoint data and the server that decides whether the posture of the endpoint meets NAC policies.
With a standard in place, vendors can build their agents and servers so they can talk to agents and servers that likewise conform to the standards.
The two protocols – known as posture assessment protocol and posture broker protocol – have been forwarded to the Internet Engineering Steering Group for final review. Under IETF rules, the IESG determines “in a timely fashion” whether to approve proposed standards that are submitted to it.
Mobile Security: Is Anyone Listening?
http://www.newsfactor.com/story.xhtml?story_id=010000BR3KPE&page=1
Mobile computing has become one of the most difficult areas of security to manage given the complexity of today's information systems. Many people have enough trouble securing their immobile systems. Throw hundreds if not thousands of more devices into the mix, and what's an IT security professional to do? Electronic data has sprouted legs.
Mobile computing has become a cornerstone of business productivity. All of the conveniences and benefits associated with mobile computing are obvious to practically everyone. Envision a world without wireless Internet, smart phones and remote access. It is hard to imagine how we could get by without it.
Now that's the rose-colored glasses perspective -- but there is a dark side to mobile computing that very few in business want to talk about or address. It is the flip side to all of those conveniences and benefits: the threats lurking, awaiting their turn to exploit the weaknesses inherent in every mobile device.
Protecting Data-at-Rest with Next-Generation Encryption Technology: Making a difficult task look easy
By Joseph Belsanti, Vice President - Marketing, WinMagic Inc.
http://advice.cio.com/protecting_data_at_rest_with_next_generation_encryption_technology_making_a_difficult_task_look_easy?page=0%2C0
With Privacy Rights Clearinghouse(1) reporting over 251,164,141 compromised data records since 2005, it is clear that today’s highly-portable computing environments not only increase productivity, but also make it extremely difficult to protect personally identifiable information and other sensitive data. The fact is that over half of corporate data resides on mobile endpoint devices in some sort of redundant or original format.
With more data residing on portable devices such as laptops and removable media (USB storage devices, CD/DVDs, etc.), unintentional and intentional security breaches are becoming commonplace. As a result, eliminating data theft and its damaging consequences is now a top priority for all organizations. But, how is this best accomplished?
Sector-by-sector, full-disk encryption is the recognized industry best practice and the most effective method for protecting data-at-rest stored on hard drives and removable media. Unlike file encryption, it encrypts all stored data, including file names and associated metadata, rendering them “invisible” to unauthorized users.
Full-disk encryption has three main components: A pre-boot authentication methodology, with single- or multiple-factor authentication (password, USB token, smartcard, biometrics, PKI and/or TPM); an AES encryption engine; and a management server.
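The sector-by-sector idea can be illustrated with a deliberately simplified sketch. A SHA-256-based XOR keystream stands in here for the real AES engine (actual drives and FDE products use AES, typically in a tweakable mode such as XTS); the point is only that every sector, filesystem metadata included, is transformed under the key combined with its sector number, so identical plaintext in different sectors encrypts differently.

```python
# Toy illustration of sector-by-sector full-disk encryption.
# NOT real cryptography: a hash-based XOR keystream stands in
# for a production AES engine.
import hashlib

SECTOR_SIZE = 512

def _keystream(key: bytes, sector_no: int, length: int) -> bytes:
    """Derive a per-sector keystream from the key and sector number."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + sector_no.to_bytes(8, "big")
                              + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def crypt_sector(key: bytes, sector_no: int, data: bytes) -> bytes:
    """XOR is its own inverse, so this both encrypts and decrypts."""
    ks = _keystream(key, sector_no, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

Because the sector number feeds the keystream, two sectors holding the same bytes produce different ciphertext, which is exactly the property that keeps whole-disk patterns from leaking.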
International privacy initiatives are helping drive the demand for strong user authentication. Government, finance, and healthcare sectors are increasingly utilizing multiple-factor authentication, consisting of combinations employing passwords, smartcards, USB tokens, biometrics, PKI and TPM right at pre-boot – often leveraging the same technology to make it simple for users to authenticate themselves for both physical and network access with a single device. This holistic approach to security is far easier for the user as only one device/password is required. As a result, organizations are increasingly looking for encryption software that is simple to integrate with authentication technology at pre-boot – making the encryption process transparent to the user.
Due to the increased demand for data security, AES encryption has reached a commoditization state, and is increasingly bundled within commercially-available operating systems and hardware. Today’s encryption software must also be able to seamlessly integrate with these technologies.
Without the third component, the management server, large enterprises would find it virtually impossible to configure, deploy and manage data-at-rest security for large numbers of users/user groups. In addition, it would be impossible to effectively manage the heterogeneity of the enterprise network, with Mac-encrypted computers, hardware-based encrypted hard drives and software-based encrypted hard drives. Ideally, the management of user groups can be synchronized with LDAP servers, such as MS Active Directory, in order to share encryption keys, eliminate passwords, and provide a more positive experience for the end user through seamless access to data for authorized users.
Additionally, management servers should enable organizations to label encryption keys in human readable text to make it easy to identify/access encrypted archive data stores over time, and also should provide central management and dynamic provisioning of encryption keys so that secure data access can be extended to clients/consultants.
As a result, organizations should look for a full-disk encryption solution that makes it simple to comply with all international privacy and security regulations by protecting all data-at-rest on endpoint devices, and removable media – without sacrificing administrator or end-user productivity.
The selected full-disk encryption software should not only provide a highly-secure AES encryption engine, but also the versatility and ease of use that enable organizations to tailor data protection to meet both security and productivity requirements. Whether bundled with hardware-based solutions, such as Seagate’s Momentus 5400 FDE.2 self-encrypting hard drive, or added as a stand-alone software security component, the encryption software should enable organizations to seamlessly integrate full-disk encryption with other technologies, such as multiple-factor authentication, and should make it simple to extend endpoint security to include increasingly-popular removable media.
Widely dispersed computing environments have made it more difficult to protect data-at-rest. Successful encryption must be able to seamlessly integrate with existing applications, such as multiple-factor authentication, at pre-boot; must be able to centrally-manage multiple encryption schemas and their associated keys; and must be able to provide encryption for all data on all devices, including removable media. Thanks to innovative improvements in security, versatility and ease of use, next-generation encryption solutions now make this difficult task look easy.
FDE is now "SED":
http://www.theregister.co.uk/2009/02/03/seagate_2tb_constellation/
The new Seagate drive also has full disk encryption, now re-named SED for Self-Encrypting Drive, so forget the FDE (Full Disk Encryption) term.
Seagate sets Constellation above Barracuda
Matches Western Digital
By Chris Mellor
Posted in Storage, 3rd February 2009 12:00 GMT
Seagate has launched its much-expected 2TB drive under the Constellation brand, which includes 2.5-inch small form factor (SFF) drives, for nearline storage needs.
The Constellation ES 3.5-inch drive replaces the Barracuda ES and comes in 500GB, 1TB and 2TB capacities with 500GB platters and SATA and SAS interfaces. It spins at 7200rpm with user- and OEM-selectable boundaries for rpm reductions. Seagate brands this Power Choice and provides four levels:
1 - Up and spinning at 7200rpm
2 - Head parked away from the platters
3 - Partial spin down
4 - Drive stops spinning
The drive can be set to enter the various power-saving levels after or for definable periods. Xyratex is quoted by Seagate as making approving noises about the drive which will ship in Q3. The Barracuda desktop drive remains available.
Western Digital announced (http://www.reghardware.co.uk/2009/01/27/review_internal_hard_drive_wd_caviar_green_2tb/) its 2TB Caviar Green SATA drive, spinning at up to 5400rpm, just a few days ago, ceding the 2TB, 7200rpm and SAS 2.0 slot to Seagate which can say its drive has a faster data response and delivery time while still offering settable power-saving levels.
The Constellation is a 2.5-inch drive in 160GB or 500GB capacities with 3Gbit/s SATA and 6Gbit/s SAS 2 interfaces, and it has the SED and Power Choice features. Seagate says this is the first nearline SAS 2 drive, and Dell will ship arrays using it. The nearline status marks its differentiation from the Savvio enterprise performance SFF drive range, and its data centre use separates it from Seagate's Momentus laptop drive range. The SFF Constellation will ship this quarter.
Seagate is positioning the Constellation products as data centre storage array drives. Assuming that the Barracuda quality problems are sorted then we can expect array vendors to start shipping products using the drives from Q2 for the 2.5-inch model and from Q3 or 4 for the 3.5-inch one. ®
VA Settlement Demonstrates Just How Costly Lax Security Can Be
http://gcn.com/articles/2009/02/02/va-data-breach-suit-settlement.aspx?s=gcndaily_030209
If you want another good reason to make sure your sensitive data is adequately locked down, look no further than the Veterans Affairs Department, which last week agreed to pay $20 million to settle a class action lawsuit over the 2006 loss of a laptop containing records with personal information about up to 26.5 million veterans and active duty personnel.
That’s a lot of money, and it will be paid from taxpayers’ dollars, but VA got off lucky. The suit originally asked for $1,000 for each person whose data was exposed, which could have been more than $26 billion. That’s nearly enough to bail out a good-sized bank.
The settlement demonstrates that the repercussions of exposing data can be long-lasting and that the cost can go far beyond the immediate expense of cleaning up the breach. For companies it has long been known that negative publicity resulting from public notification of a data breach can quickly translate into millions of dollars of lost shareholder value as stock prices tumble. Agencies do not have to worry about stock prices, but the threat of other costs is real. The VA agreed to the settlement even though the department has said there is no evidence that the information on the stolen laptop was used or that any person involved was harmed by it.
Lesson: It could be a lot cheaper to secure your data in the first place than to pay for damage control later.
To its credit, the VA generally has responded well to this incident, despite an initial three-week delay between the reported theft and the alerting of possible victims. Since then the department has gotten serious about improving protections on data and has been a major user of Microsoft’s Rights Management Services, which places controls on the use of documents. Security still is not perfect, but it is a huge department with hundreds of facilities and offices located around the country offering a multitude of services, so it is going to take a long time to get everything under control. But the department did the right thing in stepping up and taking responsibility for the huge 2006 loss and agreeing to the payout, even if it does hurt the taxpayers.
The lawsuit was filed in U.S. District Court in Washington by five veterans groups in June 2006, a month after news was released of the theft of a laptop on which a VA data analyst had loaded the data. The laptop was recovered with the data apparently intact. But it is impossible to say with absolute certainty that the data was not accessed and copied. Millions of persons whose names, birthdates and Social Security numbers were in the data were put to the trouble of monitoring their credit and worrying about data theft.
The settlement calls for payments of $75 to $1,500 to persons who can show some harm resulting from the incident, which could include physical symptoms of stress or expenses for credit monitoring. Any money left over from the $20 million fund will be donated to veterans’ charities.
Let’s hope that few of the veterans whose data was exposed in the incident were badly harmed by it and that actual payouts of damages will be small. The upside of the incident could prove to be twofold: A sizeable chunk of money could go to deserving charities, and a lesson will have been learned about the value of preventing a breach rather than responding to it after the fact.
Needham Sees March or April Rebound in Disk Drives
Posted by Tiernan Ray
http://blogs.barrons.com/techtraderdaily/2009/02/02/needham-sees-march-or-april-rebound-in-disk-drives/
Needham & Co. analyst Richard Kugele today writes that we are “nearing the trough for negative data points in hard disk drive land.” He believes stabilization of demand for drives will become apparent “over the next couple weeks,” leading to trough valuations for the group, whereupon you can start to buy the stocks.
In particular, drive inventory industry-wide is at about 2 to 2.5 weeks, he thinks, below the 4 weeks of inventory Seagate (STX), the largest vendor, indicated when it reported earnings on January 21. That’s the lowest level Kugele has ever heard of, he says, and his sources tell him capacity is running short for every model of drive. He also points to drive price increases from Seagate and Western Digital (WDC). Kugele sees a rebound for the entire hard drive group in March or April. He also thinks Fuji will get out of the business of supplying Seagate and others with the materials incorporated in drives.
Today, Seagate stock is up 15 cents, or 4%, at $3.95, while Western Digital shares are up 84 cents, or 5.7%, at $15.53.
cooler, check out the chatter:
http://www.wavesys.com/news/spotlight/FDE_media.asp
FM
Maybe this will spur the government!!!
Feds allege plot to destroy Fannie Mae data
http://www.breitbart.com/article.php?id=D961I79O0&show_article=1
Jan 30 10:54 AM US/Eastern
URBANA, Md. (AP) - The Justice Department says it foiled a plot by a fired Fannie Mae contract worker in Maryland to destroy all the data on the mortgage giant's 4,000 computer servers nationwide.
The U.S. Attorney's Office says 35-year-old Rajendrasinh Makwana, of Glen Allen, Va., is scheduled for arraignment Friday in U.S. District Court in Baltimore on one count of computer intrusion.
U.S. Attorney Rod Rosenstein says Makwana was fired Oct. 24.
Rosenstein says that on that day, Makwana programmed a computer with a malicious code that was set to spread throughout the Fannie Mae network and destroy all data this Saturday.
Makwana's federal public defender did not immediately return a call seeking comment.
Washington-based Fannie Mae is the largest U.S. mortgage finance company.