HP Ready For NAC Fight
http://www.enterprisenetworkingplanet.com/news/article.php/3742476
Much of the network access control (NAC) hype over the last few years has involved network defense. For Mauricio Sanchez, chief security architect, ProCurve Networking by Hewlett-Packard (NYSE: HPQ), NAC can be used as an offensive tool as well.
The HP ProCurve offensive playbook for NAC comes as HP embraces Microsoft's NAP technology with the ProCurve Identity Driven Manager (IDM) policy management tool. Microsoft's NAP has the potential to drive NAC adoption even further into the enterprise mainstream now that Server 2008 is generally available.
"Like any good sports team you need a good offense and a good defense to win the game, and from a security perspective we feel that our approach should be the same," Mauricio Sanchez, chief security architect, ProCurve Networking by HP, told InternetNews.com. "On the offensive side, the first layer is around Network Access Control, this is where the network interrogates identity and the health state of users and devices," he said, adding that the term NAC means different things to a lot of people.
"To us and to me, NAC is more of a solution architecture based on performing some kind of access control when users connect to a network," Sanchez explained. "So it's not about a particular product or technology."
According to Sanchez, NAC is also about products and technology that convey the idea that network access should be limited and that people should be asked some questions before they are permitted to connect.
Sanchez noted that once you get past the offensive layer, with user and system interrogation, it's important to have defensive layers to address real time threats against the network and to protect against failures in the offensive layer.
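The offensive-layer admission decision Sanchez describes — interrogate identity and endpoint health before granting access — can be sketched roughly as follows. This is an illustrative model only; the function names, health checks, and policy values are invented for the example and do not reflect any vendor's actual API.

```python
# Hypothetical sketch of a NAC admission decision: interrogate identity
# and endpoint health, then map the answers to a network role.
def admit(user_authenticated: bool, patch_level: int, av_running: bool,
          min_patch_level: int = 10) -> str:
    """Return a network role based on identity and endpoint health."""
    if not user_authenticated:
        return "deny"            # unknown user: no access at all
    if patch_level < min_patch_level or not av_running:
        return "quarantine"      # known user, unhealthy device: remediation VLAN
    return "full-access"         # known user, healthy device

print(admit(True, 12, True))     # full-access
print(admit(True, 8, True))      # quarantine
print(admit(False, 12, True))    # deny
```

The point of the sketch is the ordering: identity is checked first, health second, and failure at either stage yields limited access rather than a simple allow/deny.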
He says HP will be on the offensive layer of NAC by integrating Microsoft's NAP with HP's Identity Driven Manager (IDM) application.
NAP is an integrated component of Windows Server 2008, which was launched earlier this year.
NAP provides built-in capabilities to verify endpoint health as devices come onto the network. It also provides "a nice baseline for us to leverage as a network vendor and take advantage of it," Sanchez commented.
HP's IDM, meanwhile, allows administrators to define access policy based on user group information, time of day and location -- all by way of an easy-to-use GUI.
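The policy model described for IDM — access decisions keyed on user group, time of day, and location — can be illustrated with a small lookup table. The schema, group names, and VLAN labels below are invented for illustration and are not HP's actual IDM data model.

```python
from datetime import time

# Illustrative (not HP's actual) policy table: (group, location) maps
# to an allowed time window and the VLAN assigned within it.
POLICIES = {
    ("engineering", "campus"): (time(0, 0), time(23, 59), "vlan-eng"),
    ("contractors", "campus"): (time(8, 0), time(18, 0), "vlan-guest"),
}

def assign_vlan(group: str, location: str, now: time):
    """Return the VLAN for this user/place/time, or None to reject."""
    rule = POLICIES.get((group, location))
    if rule is None:
        return None                      # no matching policy: reject
    start, end, vlan = rule
    return vlan if start <= now <= end else None

print(assign_vlan("contractors", "campus", time(9, 30)))   # vlan-guest
print(assign_vlan("contractors", "campus", time(20, 0)))   # None
```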
Though Microsoft NAP has been officially available only for a few months, it already has a lot of backers. More than a year ago, Microsoft claimed it had more than 100 vendors lined up to support and interoperate with NAP.
Sanchez noted that HP is looking at NAC from a comprehensive network framework perspective, which is a distinct advantage over a pure play NAC vendor. In Sanchez's view pure play NAC solutions are a dead end.
Another key attribute for NAC success is interoperability, something the Trusted Computing Group's Trusted Network Connect (TNC) aims to achieve.
Sanchez is a chair on the TNC working group, where both HP and Microsoft are contributors. Last year Microsoft announced that it would work toward TNC interoperability with NAP.
Technically the interoperability involves TNC support for a Microsoft NAP approach called Microsoft Statement of Health Protocol. The IF-TNCCS-SOH (TNC client server - statement of health) protocol acts as a transport to help validate that an end point meets the security requirements.
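The idea behind a statement-of-health exchange can be rendered in miniature: the client asserts facts about its own state, and the server checks those claims against policy before admitting it. The field names and JSON encoding below are invented for illustration; the real IF-TNCCS-SOH protocol is a binary exchange defined by the TNC and Microsoft specifications.

```python
import json

# Toy statement of health: the client's claims about its own state.
soh = json.dumps({"os": "Windows Vista", "firewall": True,
                  "av_signature_age_days": 2})

def healthy(soh_json: str, max_av_age: int = 7) -> bool:
    """Server-side check: do the claims satisfy the health policy?"""
    claims = json.loads(soh_json)
    return claims.get("firewall", False) and \
           claims.get("av_signature_age_days", 999) <= max_av_age

print(healthy(soh))   # True
```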
Leveraging TNC
A year later, IF-TNCCS-SOH is still not ready. Sanchez noted that HP's interoperability with NAP does not come by way of TNC at this point, but rather by way of Microsoft APIs for Server 2008.
Other networking vendors such as Juniper Networks (NASDAQ: JNPR) have already pledged to implement TNCCS-SOH when available.
"The Juniper Networks Infranet Controller, the policy management server at the heart of Juniper Networks Unified Access Control (UAC), will be able to leverage the TNC standard IF-TNCCS-SOH protocol," Rich Campagna, senior product manager, Juniper Networks told InternetNews.com.
"Juniper Networks UAC is expected to support this new TNC standard in the first half of 2008," Campagna said, adding that at Interop Las Vegas 2007, the company showed a preliminary prototype of this technology.
HP's Sanchez argued that the differences between what is available with Windows Server 2008 today and what IF-TNCCS-SOH will provide are "99 percent the same thing." The differences according to Sanchez are minor bug fixes and improvements.
Overall, though, Sanchez noted that the TNC does matter and customers appreciate open standards and the ability to choose from a vendor set that supports those standards.
That said, work needs to be done with TNC to make it more effective.
"One of the things we're working on is a compliance program to verify that vendors are adhering to the standards, and I believe that will relieve a lot of the deployment headaches that people face today," Sanchez said.
He added that today many interoperability difficulties exist among vendors who claim to be open standards based.
Interop Labs test results: Microsoft gets its NAC act together
http://www.networkworld.com/research/2008/042308-ilabs-nac.html?ts0hb=&story=ac1_interop
Microsoft edges ahead of NAC rivals, particularly on the client side... four pages/good read.
Aruba Emphasizes Preparedness In New Products and Enhancements
Enterprise WLAN sales will surge over the next three years, triggered by 802.11n ratification and workforce mobilization. As vendors reposition to ride this next wireless wave, Aruba Networks is betting that extensibility and preparedness will be key differentiators.
"Our newest APs address an issue many customers now face," says Michael Tennefoss, Aruba's head of strategic marketing. "They don't necessarily need 802.11n today, but they have to use or lose this year's budget. We're offering n-ready APs at a price that fits into existing budgets, but can be simply upgraded by downloadable keys in the future."
Aruba's RFprotect Wireless Intrusion Prevention System offers customers another way to prepare for tomorrow. "Security is an unending, iterative process in which the best defense is built by rapidly integrating updates about real or potential attacks," said project manager Rajeev Shah. "We're pleased to be the first to make [user-defined threat definitions] available for WIPS."
Facilitating growth
Two more announcements—a Remote Access Point (RAP) mobility upgrade and AirWave Management 802.11n extensions—round out Aruba's "get ready to rumble" campaign. Bigger, faster WLANs are on the horizon, but companies that deploy them are in for growing pains. With these announcements, Aruba is trying to help its customers grow more gracefully.
For starters, Aruba capitalized on the modular design of its 802.11n draft 2.0-compliant APs to lower the bar for customers who plan to upgrade to n.
"Our existing 802.11n APs list for $1295," explained Tennefoss. "Our new AP-124ABG and AP-125ABG APs will list for $995. The price of a downloadable key to unlock 802.11n operation will vary by volume, but if you're upgrading 500 APs, each key would cost $300." In other words, customers can use Aruba's new AP-124ABG (detachable antennas) or AP-125ABG (integrated antennas) to defer one fourth of their new hardware investment.
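The deferral arithmetic behind that quote works out as follows. The figures are the article's; the code just does the division.

```python
# Pricing from the article: buy the cheaper a/b/g-priced AP now, pay
# for the 802.11n unlock key later at volume pricing.
existing_n_ap = 1295     # list price of Aruba's existing 802.11n AP
new_ap        = 995      # AP-124ABG / AP-125ABG list price
key_at_500    = 300      # per-key price when upgrading 500 APs

total = new_ap + key_at_500
deferred_share = key_at_500 / total
print(total)                        # 1295 -- same as buying n outright
print(round(deferred_share, 2))     # 0.23 -- roughly a quarter deferred
```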
Even when used in 802.11a/b/g mode, these 3x3 MIMO APs will deliver improvements like better sensitivity and TPM secure key storage. When customers are ready, they can then unlock 802.11n without increasing power consumption. "Our APs offer full 802.11n performance with full power output and encryption over a single [802.3af] PoE port," said Tennefoss. "Many other APs have drastically higher power consumption that requires infrastructure upgrades."
Preparing for the unknown
The next version of RFprotect, available early summer, has also been modified to promote future extension.
"New attacks require intrusion detection vendors to create attack signatures, test them, and then issue new releases. In the wireless world, this has been done by each vendor. [In this update], we're giving users the ability to create their own new signatures with a scripting language," explained Tennefoss.
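To show what a user-defined signature might look like in spirit, here is a generic rule that flags a deauthentication flood. This is a sketch in Python, not Aruba's actual RFprotect scripting language, and the threshold and window values are invented.

```python
# Hypothetical user-defined WIPS rule: flag a burst of 802.11 deauth
# frames as a probable denial-of-service attack.
def deauth_flood_rule(frames, window_s: float = 10, threshold: int = 50) -> bool:
    """frames: list of (timestamp_seconds, frame_type) tuples."""
    deauths = [t for t, ftype in frames if ftype == "deauth"]
    if not deauths:
        return False
    span = max(deauths) - min(deauths)
    return len(deauths) >= threshold and span <= window_s

burst = [(i * 0.1, "deauth") for i in range(60)]   # 60 deauths in 6 seconds
print(deauth_flood_rule(burst))   # True
```

The appeal of user-defined rules is exactly this shape: when a new attack appears, a customer can encode a detector like the one above immediately instead of waiting for a vendor release.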
Aruba hopes that public dissemination will reduce the time between threat discovery and mitigation. In December 2005, Network Chemistry established the Wireless Vulnerabilities and Exploits database, a public initiative to increase vulnerability awareness. Aruba acquired the WVE along with Network Chemistry's security assets last year. This month, several 802.11n vulnerabilities were submitted; more are likely with an update as complex as 802.11n. According to Tennefoss, Aruba will submit signatures to WVE, but hopes that others throughout the community will make contributions as well.
Managing expansion
As WLANs grow, enterprises must contain the spiraling cost of maintenance and monitoring. Aruba tackled that challenge last year by acquiring AirWave, maker of the popular AirWave Wireless Management Suite.
"The AirWave acquisition is the smoothest that I've been involved with," said Tennefoss. "This is because we weren't trying to assimilate them into the Borg. Rather, we wanted to learn from them and let them continue to do what they do so well. We didn't try to change AirWave to fit into Aruba -- we changed Aruba to integrate with AirWave."
The upcoming 6.0 release includes 802.11n migration management features and more extensive help desk and diagnostic support -- features important to expansion. Tennefoss stressed that 6.0 supports more than Aruba's 802.11n products. "Some customers were concerned that we would drop AirWave support for other vendors. That's not our intention at all, and now we've put our money where our mouth was at the time of the acquisition by continuing to provide multi-vendor management," he said.
Moving beyond Wi-Fi
Finally, Aruba announced a mobility enhancement to the Remote Access Point (RAP) released last year.
"You can take an Aruba AP-70 and download RAP software to turn it into a combo VPN replacement and mobile work platform," explained Tennefoss. "Once I have RAP software loaded, I can plug that AP in anywhere--it automatically goes out on the Internet to find my Mobility Controller and download all of the same policies that apply to me back at the office."
The mobile RAP release, available at no extra charge to licensed RAP module customers, adds 3G cellular modem support. The mobile RAP's objective is to securely deliver enterprise applications to teleworkers and mobile users. In particular, 3G is targeted at organizations with highly mobile workers, temporary secure access needs, or first responder and disaster recovery requirements -- anywhere that a RAP could be used but Ethernet is not available. If both 3G cellular and Ethernet are available, the RAP selects the highest-speed WAN link. "This is a completely plug and play solution," said Tennefoss. "The IT department configures the unit and users just turn it on."
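The uplink selection described — prefer the fastest available WAN link — is simple enough to sketch. Link names and nominal rates below are illustrative, not Aruba's actual configuration model.

```python
# Sketch of the RAP's WAN-link choice: among links that are currently
# up, pick the one with the highest nominal speed.
def pick_uplink(links):
    """links: dict of name -> (available, nominal_mbps)."""
    up = {name: mbps for name, (avail, mbps) in links.items() if avail}
    return max(up, key=up.get) if up else None

print(pick_uplink({"ethernet": (True, 100), "3g": (True, 1.8)}))    # ethernet
print(pick_uplink({"ethernet": (False, 100), "3g": (True, 1.8)}))   # 3g
```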
Aruba is testing the mobile RAP for compatibility with Verizon, AT&T Wireless, and Sprint EV-DO and HSDPA services. It also plans to test with 3G providers in Asia and the European Union. Like the other upgrades announced this week, Aruba's Mobile RAP will make its public debut next week at the Interop Las Vegas.
Lisa Phifer owns Core Competence, a consulting firm focused on business use of emerging network and security technologies. An avid fan of wireless, Lisa has written, taught, and consulted about Wi-Fi security since 2001, and participated in the WVE since its inception.
okay TPT.........
You've piqued my interest... what the heck is Helgasincock? Since it is "OT" I'm gonna have to delete my question and your answer... so answer quickly!! LOL and thanks
Jon Oltsik: Microsoft's 'chain of trust'
http://www.news.com/8301-10784_3-9924798-7.html
It's been a few weeks since the RSA Conference 2008 and I'm now preparing for Interop. Nevertheless, I wanted to get in my two cents worth regarding Craig Mundie's RSA keynote address on what Microsoft is calling "End to End Trust."
End to End Trust? What about the often-discussed Trustworthy Computing initiative that Microsoft introduced in 2001? It's still around but Microsoft realized that Trustworthy Computing alone may not be enough. So what else is needed? Craig Mundie mentioned:
• 1. A chain of trust. As the old security saying goes, "the security chain is only as strong as its weakest link." Microsoft has done a good job making Windows more secure with each iteration but it really doesn't matter if the bad guys compromise your data by hacking in at the application layer. Microsoft is suggesting a model where the entire technology stack must adhere to a trust relationship (i.e., each piece is authenticated and validated and all changes must be approved) where every component relies on the others.
• 2. A new identity model. Identity is no longer about user name and password alone. In today's computing environment, you also have to consider device type (i.e., am I communicating via my PC, cell phone, or PDA?), location, and the user's work and personal profile. Yes, this complicates things but there is no getting around the fact that I use the same laptop to do my job during the day and then bid on vintage Gretsch guitars at night.
• 3. Industry participation. Microsoft readily admits that it can't establish end-to-end trust on its own. Of course, Microsoft won't be shy about suggesting technologies for connectivity and standardization, but it really does need help here. It's time that the security industry stop its outright mistrust of Microsoft and extend an olive branch.
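The chain-of-trust idea in the first point can be illustrated with a measurement chain, where each layer is hashed into a running value before it runs, so a change anywhere breaks the final verification. Real implementations (e.g., TPM measured boot with PCR extends) are far more involved; this sketch only shows the shape.

```python
import hashlib

# Minimal measurement chain: fold each layer's identity into a running
# SHA-256 hash. Any altered layer changes the final value.
def measure(component: bytes, previous: bytes) -> bytes:
    return hashlib.sha256(previous + component).digest()

stack = [b"firmware-v1", b"bootloader-v1", b"os-kernel-v1", b"app-v1"]

chain = b"\x00" * 32
for layer in stack:
    chain = measure(layer, chain)
expected = chain              # value recorded when the stack was trusted

# Tamper with one layer: the final measurement no longer matches.
chain = b"\x00" * 32
for layer in [b"firmware-v1", b"evil-bootloader", b"os-kernel-v1", b"app-v1"]:
    chain = measure(layer, chain)
print(chain == expected)      # False
```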
In my view, Mundie's keynote was effective in that it really got the industry's attention. Many security professionals and vendors recognize the need for this End to End Trust model, while organizations like the Computer Security Institute (CSI), the National Institute of Standards and Technology (NIST), and the Trusted Computing Group (TCG) are already working on similar efforts.
In past years, Microsoft keynotes were full of product demonstrations and funny video montages. Its End to End Trust discussions demonstrate a new Microsoft focus--and the remaining problems associated with information security.
Medium Businesses in the US to Invest US$7.8B on Storage & Security
—Up to 77% of medium-sized businesses reported a hard drive failure in the last 12 months, says AMI—
NEW YORK--(BUSINESS WIRE)--Medium-sized businesses (MBs or companies with 100-999 employees) in the United States are on track to invest up to US$7.8 billion on data storage and security this year, up 12% over 2007. This steady growth is due to the need to consistently update and guard against security breaches and the growing volume of employee/client data, according to the latest study by New York based Access Markets International (AMI) Partners Inc.
“Up to 86% of US MBs reported a security breach/data loss in the last 12 months,” says Nichelle McKenzie, New York based research analyst at AMI-Partners. “The cost to these companies is about US$7,000 per year.”
When handling sensitive data, security and storage are top of mind concerns among US-based MBs. The AMI study found that storage and security investments account for 9% (about US$7.8 billion) of total IT/telecom spending. “US MBs are primarily investing in software to backup PCs and servers,” Ms. McKenzie says. “Up to 90% of MBs stated they use data backup and data recovery software, and 30% are sticking with the familiar by planning to use DAS (direct attached storage) in the next 12 months.”
Preventing hard-drive failures and electronic attacks while having an adequate storage solution are the main goals for US MBs. Deploying in-house or hosted data backup and disaster recovery solutions is seen as a top IT-related issue by almost 70% of MBs. Nearly all MBs said they backed up their data on LAN servers. This means MBs are actively competing to remain lead software and hardware IT adopters.
“The preferred on-site server data backup solution is tape-based, according to 89% of MBs,” Ms. McKenzie says. “An average MB deploys a storage capacity of 988 GB. MBs are also looking to invest in more advanced, greater-volume storage products.”
The average spending on SAN (storage area network) solutions by US MBs is about US$4,900, with 20% purchasing this technology in the last 12 months. MBs tend to go to their trusted channel partners/VARs/IT consultants for storage-related solutions, the AMI study found.
In terms of security related strategies, about 85% of MBs said enhancing IT security and privacy is the top IT strategy for the next 12 months. Thus, awareness of security products is the top factor driving security investments for this year.
“MBs need to stay abreast of new and/or advanced software and hardware that will help them to remain competitive and compliant,” Ms. McKenzie says. “Almost 33% of MBs agree that remaining compliant is the prime reason for them to invest and consistently upgrade to more advanced software and hardware for security and storage. As high as 77% of US MBs reported a hard drive failure in the last 12 months.”
Deploying a VPN (virtual private network) is an added measure to ensure that an MB will have a secure IT infrastructure; just over half of MBs polled said they plan to implement this in the coming months. Most MBs said that their data is adequately protected from electronic threats as almost all MBs use antivirus, anti-spam and anti-spyware solutions.
US MB companies use a variety of firewall devices which includes software-based firewalls running on PCs and network servers, firewalls embedded in networking hardware and firewall appliances plugged into the network. Intrusion detection, encryption and security management are top technologies for these MBs to implement in the next 12 months, the study revealed. The most popular brand for security related products was Cisco, and for storage related products Dell is the top manufacturer purchased.
About the Study
AMI’s 2007 US Medium Business Overview and Comprehensive Market Opportunity Assessment study highlights these and other major trends in the context of current/planned IT, Internet and communications usage and spending. Products and services covered include established and emerging hardware, software, applications and business process solutions. Based on AMI’s annual surveys of MBs across the U.S., the study tracks a broad spectrum of issues pertaining to budgets, purchase behaviors, decision influencers, channel preferences, outsourcing, service and support. Also covered are detailed firmographics and critically important technology attitudes and strategic planning priorities. This data points to key opportunities and messaging hot buttons for vendors and service providers seeking to match their offerings to MB market requirements.
Mobile Device Management Services, a $20 Billion Opportunity by 2013
NEW YORK--(BUSINESS WIRE)--April 18, 2008--
A very complex mobile value chain and a growing business reliance on mobile products have created the need for services that help businesses maximize the value of their mobile investments. In a recent report from ABI Research, mobile device management (MDM) services are forecast to grow from $583 million in 2007 to over $20 billion by 2013, a compound annual growth rate of 80%.
Mobile device management services include policy development, procurement and asset management, billing audit and reconciliation, enhanced customer care, device/content security, and additional services that are vertical- and occupation-specific. According to principal analyst Dan Shey, "The range of services needed to manage a business's mobile investments requires inputs from many different wireless equipment and services providers including operators, MDM platform vendors, IT services providers, telecom expense management firms, and mobility management services firms. All want to be part of the action."
But the biggest disruptors for the market have yet to make their mark on the industry. Mobile device management platforms are slowly evolving with more and more capabilities for mobile device management services. Their increasing functionality on a growing base of device platforms automates many important mobile device management processes. Not only can these platforms be employed by many value chain suppliers, but more importantly, businesses can acquire and use these platforms themselves.
Shey concludes, "Operators are the dark horses in this race: they provide and appropriate revenues for nearly all mobile capabilities, they already use MDM platforms, and they value the mobile business customer highly. MDM platforms, and the possible participation scenarios of the mobile operator, beg the question: 'Who will manage?'"
ABI Research's "Mobile Device Management for Business" (http://www.abiresearch.com/products/market_research/Mobile_Device_Management_for_Business) provides a comprehensive competitive analysis of the mobile device management market. Critical components include a review of key value chain players, business decision-maker buying criteria, and the complementary and substitutive effects that mobile device management platforms will have on the mobility management services business. Examination of demand drivers and inhibitors underpins growth forecasts for customer adoption, revenues, ARPUs and penetration by worldwide region and size of business. Concluding the report are prognoses for change in the competitive environment and recommendations for value chain players.
Can someone explain this?
http://www.macworld.com/article/133051/2008/04/psystar_gaming.html?t=109
Psystar, according to information posted on its Web site, is leveraging the work of the OSx86 Project in order to get its PC to function with Mac OS X. The OSx86 Project first came to the fore after WWDC 2005, and it’s had success finding ways to work around Apple’s hardware requirements for Mac OS X, such as looking for the “Trusted Platform Module” (TPM), which ordinarily prevents OS X from running on non-Apple hardware.
Interest in Psystar shows market for gaming Mac
by Peter Cohen, Macworld.com
Apr 17, 2008
All this hullaballoo this past week over Psystar and its supposed PC-based Mac clone can lead to a number of different conclusions about the state of the Mac market. But to the game-minded, the interest in an OS X-equipped PC underscores the fact that there’s a hole in Apple’s product line worth addressing: a mid-range computer that could well help fill the void that Mac gamers have felt for years now.
Let’s rewind the clock back to 2005, when Steve Jobs first announced Apple’s plans to migrate to an Intel microprocessor architecture during his keynote address to attendees of Apple’s Worldwide Developers Conference. In the days and weeks after that event, developers discovered that Apple’s test machines were little more than PC motherboards equipped with software to run an Intel-native build of Mac OS X.
Since then, interest has run high in technical circles over the feasibility of creating a “Hackintosh,” if you will—a PC, built from commodity parts, that can run Mac OS X. The high cost of Apple hardware has long been seen as an impediment to consumer adoption of the Mac, especially among hobbyists, many of whom like to use their computers to play games in their spare time.
Apple’s rising marketshare notwithstanding, some believe it would rise even faster if the Mac hardware was an open architecture—if you could build your own box to run Mac OS X on, just like you can with Windows.
Psystar, according to information posted on its Web site, is leveraging the work of the OSx86 Project in order to get its PC to function with Mac OS X. The OSx86 Project first came to the fore after WWDC 2005, and it’s had success finding ways to work around Apple’s hardware requirements for Mac OS X, such as looking for the “Trusted Platform Module” (TPM), which ordinarily prevents OS X from running on non-Apple hardware.
Inside baseball regarding Psystar’s clone program aside, the company’s efforts have received an enormous amount of attention from Mac users this past week. And a lot of it has to do with the interest of users who would love to have a Mac, but aren’t willing to be subjected to Apple’s hardware restrictions.
Don’t get me wrong—I think Apple makes beautiful boxes. But hobbyists who play games have a point when they say that the Mac is expensive. I can put together a considerably more powerful Windows gaming machine for 30 to 40 percent less than I can when paying Apple’s prices. And I can do it using components I wouldn’t be able to use in a Mac, such as third-party Nvidia and ATI-based graphics cards that aren’t available on the Mac, different hard drives and RAM components, different microprocessors, and more.
In any case, to see Apple tap into this market would require a lot more than just the mythical midrange Mac minitower my colleague Dan Frakes has long dreamt of. It would require a fundamental shift in Apple’s product development strategy, away from keeping hardware and software in lockstep with one another, and toward a more open environment where the operating system runs on a basic hardware specification that the user can vary dramatically based on their needs.
For a variety of technical and cultural reasons, I don’t see that happening at Apple any time soon. But as a gamer, I can fantasize about it, for sure.
Wave's presentation................
A Guide to the TCG Demonstrations
RSA Conference 2008
Wave Systems Corp.
The Connected Enterprise
Data Protection:
Trusted Drives, such as the Seagate Momentus FDE.2 with an onboard security controller and embedded capabilities for media-speed full disk encryption and pre-boot authentication, can be activated using Wave’s EMBASSY® Trust Suite with Trusted Drive Manager to protect against data loss due to a lost or stolen PC.
Strong Authentication:
Embedded TPM Technology puts a hardware root-of-trust in each individual TPM-enabled PC. The TPM root-of-trust enables the strongest level of authentication into any enterprise domain. Wave’s EMBASSY® Trust Suite will allow the client computer to enable the TPM with strong authentication solutions.
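The authentication pattern a TPM enables — a key that never leaves the device signing a server-issued challenge — can be simulated in software to show the protocol shape. In this sketch a module-level secret merely stands in for a hardware-protected TPM key, and the server's "enrolled copy" glosses over the real certificate/attestation machinery; it is an illustration, not a TPM implementation.

```python
import hashlib
import hmac
import os

# Stand-in for a key a real TPM would hold in hardware and never export.
_DEVICE_KEY = os.urandom(32)

def respond(challenge: bytes) -> bytes:
    """Client side: prove possession of the device key without revealing it."""
    return hmac.new(_DEVICE_KEY, challenge, hashlib.sha256).digest()

# Server side: issue a fresh nonce, verify against enrolled key material.
challenge = os.urandom(16)
expected = hmac.new(_DEVICE_KEY, challenge, hashlib.sha256).digest()
print(hmac.compare_digest(respond(challenge), expected))   # True
```

Because each challenge is a fresh random nonce, a captured response cannot be replayed, which is what makes hardware-rooted authentication stronger than a reusable password.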
Centralized Management of Trusted Platforms:
Wave’s EMBASSY® Remote Administration Server enables centralized IT Administration to remotely deploy and manage PC clients that are equipped with Trusted Platform Modules (TPM) and/or Seagate Momentus® FDE.2 Drives.
DESCRIPTION
Wave Systems Corp. will provide a hands-on workshop walking through client and server solutions for a full disk encryption hard drive based on functionality under development in the TCG Storage Workgroup, and a network security solution for secure domain login based upon the TCG standards for the Trusted Platform Module. The TCG Workshop will cover:
Remote administration of Seagate Momentus fully encrypting drives for management, compliance, and administration of the hardware-based full disk encryption.
Centralized control of distributed TPMs in systems to provision identities and authorization from Active Directory.
Remote enablement, ownership and management of TPMs.
The participant will experience:
Secure and Convenient user authentication data protection using a Trusted Drive and Windows Login
Drive password recovery
TPM management through remote console
Network Security using Strong Authentication with TPM-based VPN, 802.1x to network, and multifactor including biometrics.
Getting Secure @ RSA 2008…thoughts from San Francisco
April 17th, 2008 by Brian Berger
Welcome back to the TCG blog! In addition to celebrating its fifth year, TCG also has had a busy week in San Francisco, one of our favorite cities, at the RSA 2008 Conference. As evidenced by the apparently added-on exhibit areas, packed food vendors and jam-packed sessions, this event is continuing its role as one of the bigger IT events.
As for TCG… we are marking our fourth year of a fairly significant presence. When we first started with RSA, we offered to host a session on what used to be a “dead” Monday. We were bringing in extra chairs that first year and have continued to offer a standing-room-only session in what the show now calls “pre-conference” content… Attendees coming in on Sunday now can fill Monday with actual work rather than sightseeing!
This year we hosted an interactive workshop. Instead of just presenting yet another slide deck, we divided a large room into 4 classrooms – one for Trusted Network Connect NAC, one for storage security and two focused on the TPM and how to use it. Attendees grabbed a sandwich at noon and then headed into the session of their choice.
After 45 minutes or so, we switched, so everyone had a chance to visit all 4 classes. Each one was full. We got lots of great questions about setting up the TPM, what to do with it, how to manage large numbers of TPM-enabled boxes and how the TPM leverages existing applications.
Over in TNC land, our enthusiastic instructors demonstrated how to keep the bad guys off the network and how to evaluate the health of those trying to jump on – with products that are available today and that many might already have.
And our friends in storage covered a lot of ground by talking first about local storage security – i.e., solving the lost laptop problem and its associated costs/risks. They then did a deep dive into the enterprise storage area – showing new enterprise storage with full drive encryption based on TCG standards.
As soon as we’d answered the many Qs, we broke down and hustled to the show floor for the 6 pm opening – we had all the workshop demos from Juniper Networks, Seagate, Wave and Infineon there, as well as a few new ones. Fujitsu joined us with a neat demo of TNC linked to palmprint biometrics for authentication! Identity Engines, which ships an open source supplicant for NAC, showed its capabilities for protecting the network.
Study: Many Computer Users Repeat Passwords For Several Accts
SAN FRANCISCO (AP)--Using the same password for multiple Web pages is the
Internet-era equivalent of having the same key for your home, car and bank
safe-deposit box.
Even though a universal password is like gold for cyber crooks because they
can use it to steal all of a person's sensitive data at once, nearly half the
Internet users queried in a new survey said they use just one password for all
their online accounts.
At the same time, 88% of the 800 people interviewed in the U.S. and the U.K.
for the survey by the Accenture consultancy, which is to be released Thursday,
said personal irresponsibility is the key cause of identity theft and fraud.
Researchers say the findings suggest that many users underestimate the
growing threat from organized cyber criminals who can reap big profits from
selling stolen identities.
"There's a lot of confusion out there - a lot of people don't think there's
a problem," said Robert Dyson, a senior executive in Accenture's global
security practice. "There's still the kind of head-in-the-sand situation: 'My
identity hasn't been stolen. I don't know anybody who's had their identity
stolen. So it must not be happening.'"
Dyson said the problem with repeating passwords is that a hacker who
successfully breaks into one account then has an easy time guessing how to get
into all the user's other accounts.
Many users repeat passwords so they do not forget them, a tendency reflected
in another finding: 70% of survey respondents in the U.K. said they don't
write down their passwords, versus 49% in the U.S.
Only 7% of the respondents said they change their passwords often, use
password management software or use a fingerprint reader to access their
machines and accounts.
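The password-manager habit the survey finds so rare can be approximated in a few lines of standard-library Python. This is a minimal sketch (function and variable names are my own, not from the survey or any product) that generates a distinct random password per account, which is exactly what defeats the "one key for home, car and safe-deposit box" failure mode described above:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate one random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One distinct password per account, so a breach of one site
# gives an attacker nothing that works anywhere else.
accounts = ["bank", "email", "shopping"]
passwords = {site: generate_password() for site in accounts}
```

Real password managers add encrypted storage and a master passphrase on top of this; the generation step itself is this simple.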
The survey looked at people who used a computer at home, have high-speed
Internet access and go online at least twice a week for something other than
checking email. The respondents were selected at random and questioned over
the telephone. The mean age was 46.
The survey's margin of error was plus or minus 3.5% for the total sample and
plus or minus 4.9% for U.S. and U.K. samples.
Accenture noted that the results represent the behavior of a random sample
of this subgroup of Internet users, not the overall general pool of U.S. and
U.K. consumers.
National Security Agency
The U.S. government's primary agency for the gathering of electronic intelligence, the National Security
Agency (NSA), has recently adopted a standard for full disk drive encryption based on the TPM. Published
NSA documentation cites how the agency has instituted full drive encryption in response to the broad
impact of data theft, the large variety of parties affected by such thievery, new legislation requiring
institutions to be responsible for data security and integrity, and changes in computing infrastructure and
user practices, all of which require both innovation in security measures and cooperation among branches
of the agency and other government agencies to protect stored data from theft and loss.
Following a practice long established with respect to incubating private industry, the NSA chose a solution
developed by the U.S. information technology industry. Also, the agency thought that using TCG
architecture would help ensure compatibility among and reliability of its vast array of PC clients. Benefits
of the architecture that the NSA cited for putting the encryption function on the disk drives included cost
effectiveness based on economies of scale, transparency in the sense that encryption implemented directly
on the drive would have zero performance impact, and the ease with which the standard architecture could
be sustained in future disk drive products.
The agency described some of the details of its implementation. Since access control is set by default to
"off," user action is required to achieve security. When a new PC is deployed, the user must set his or her
password to lock the drive. Thus, the solution depends on a robust set of institutional policies to support
drive encryption, including education for users and IT support personnel. The agency chose this method so
that users wouldn't find themselves locked out of drives before setting up their own access.
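As a toy model of the deployment flow described above (this is not real ATA security or the NSA's implementation; the class and method names are hypothetical), a drive that ships with access control off and only enforces a lock once the user sets a password might behave like this:

```python
import hashlib

class DriveLock:
    """Toy model: access control ships 'off'; user action is required
    to achieve security, as in the deployment flow described above."""

    def __init__(self):
        self.password_hash = None  # no password set: access control is off

    def set_password(self, password: str) -> None:
        """The user's deploy-time step: set a password to lock the drive."""
        self.password_hash = hashlib.sha256(password.encode()).hexdigest()

    def unlock(self, password: str) -> bool:
        if self.password_hash is None:
            return True  # default-off: the drive is open until a password exists
        return hashlib.sha256(password.encode()).hexdigest() == self.password_hash
```

The default-open branch is the reason the agency's rollout leans so heavily on policy and user education: until `set_password` runs, nothing is protected.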
Government slog/progress:
http://www.whitehouse.gov/omb/egov/c-5-authent.html
E-Authentication
Hundreds of Federal services are available to Americans electronically, but many require some form of identity verification before an agency-to-citizen or agency-to-business transaction can take place.
It takes an estimated 3 to 5 years for Federal agencies to develop electronic identity authentication systems. Duplicative agency efforts to create such systems, which do not communicate with each other, are a substantial cost burden for the government. Moreover, the public is burdened by having to complete a separate registration process (e.g., user name, password, or other electronic credential) for each agency with which they want to conduct on-line transactions.
The E-Authentication Initiative will provide a trusted and secure standards-based authentication architecture to support Federal E-Government applications and initiatives. This approach will provide a uniform process for establishing electronic identity and eliminate the need for each initiative to develop a redundant solution for the verification of identity and electronic signatures. E-Authentication’s distributed architecture will also allow citizens and businesses to use non-government issued credentials to conduct transactions with the government.
Successful implementation of E-Authentication will produce numerous benefits for the public and the Federal government. Citizens and businesses will have a secure, easy-to-use and consistent method of proving identity to the government and will be spared the burden of having to keep track of multiple sets of registration information. Federal agencies will be able to reduce authentication system development and acquisition costs and reallocate labor resources previously used to develop such systems.
NSA to offer a secure platform!!!
Agency prepares to certify its first high-assurance system for outside use
By Wilson P. Dizard III
The National Security Agency is spearheading a team of intelligence agencies and information technology vendors in an effort toward broader use of secure multilevel workstations based on High Assurance Platform (HAP) standards and specifications.
NSA expects this year to approve outside use of HAP systems, which foreshadows the technology’s adoption by federal agencies that handle unclassified data in addition to private companies and eventually individuals, specialists in the field say.
NSA and its vendors expect to complete the technical and legal reviews that constitute the certification and accreditation (C&A) required before HAP systems can be cleared for use in the secret-and-below intelligence (SABI) world. Early HAP systems have been used in the top-secret-and-above intelligence arena for many months.
More to come
The C&A milestone will clear the first HAP release, HAP r1, for use by the SABI community.
The release builds on earlier technology work but doesn’t include some of its most eagerly awaited features, said Ed Hammersla, chief operating officer at Trusted Computer Solutions.
“HAP r4 will include the cornerstones of the HAP technology,” he said. “That is due in 2012.”
Hammersla said the virtual computing features in HAP can strengthen security and provide electricity savings for agencies and companies.
“For example, companies that operate electricity grids and pipelines have become concerned that their general business-side computers, such as the mainframes used for accounting, could provide pathways for insiders to drill through to the supervisory control and data acquisition (SCADA) systems that regulate their networks,” Hammersla said.
SCADA system vulnerabilities have attracted widespread scrutiny as a weakness that terrorists could exploit to devastating effect.
Hammersla said HAP’s virtualization features and NSA’s work to assure that the platform design delivers on its potential for stronger security could even lead to greatly upgraded household computers.
“An individual user could create a secure zone for sensitive personal financial information while allowing less-trusted systems to access other parts of a home computer,” Hammersla said.
NSA told GCN in an e-mail response to questions that IT vendors could reuse the pending HAP security C&A as they develop various systems that use the platform specifications.
The HAP designs and specifications rely on shared use of features such as those called hardware root of trust and dynamic root of trust for measurement.
Those elements embed upgraded security features in chips and boards that strongly resist software attacks, sources in the intelligence community say.
The HAP standards and specifications include a mandatory trusted platform module (TPM) to carry out essential security functions, such as:
Generating asymmetric keys
Encrypting and decrypting data
Handling the keys that TPMs sign and exchange
Generating random numbers
Hashing data to secure it in transit and prevent improper access
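Two of the listed functions – random number generation and hashing – can be sketched in software with Python's standard library. A real TPM performs these in dedicated, tamper-resistant hardware, so this is purely illustrative of what the operations compute, not how a TPM implements them:

```python
import hashlib
import secrets

def generate_random(num_bytes: int = 32) -> bytes:
    """Random number generation (a real TPM uses a hardware RNG)."""
    return secrets.token_bytes(num_bytes)

def hash_data(data: bytes) -> str:
    """Hash data so any modification in transit is detectable:
    a single flipped bit produces a completely different digest."""
    return hashlib.sha256(data).hexdigest()

nonce = generate_random()
digest = hash_data(b"message to protect")
```

The asymmetric key generation and key-handling functions on the list require a full crypto library and, in the TPM case, keys that never leave the chip, which is precisely what software alone cannot reproduce.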
Perhaps the most advertised improvement that the HAP releases offer is the increasingly sophisticated use of virtualization.
Intell benefits
HAP systems’ use of virtualization, an approach that builds on NSA’s earlier NetTop architecture, could produce a clutch of intelligence technology benefits, program specialists say.
For example, HAP’s virtualization features are designed to:
Reduce costs by progressively consolidating redundant systems that now maintain security via air gap, or physically separate networking.
Help intelligence practitioners create domains, or secure communities of interest focused on a particular problem, on the fly.
Exploit the capabilities of new chips and chipsets from Intel and Advanced Micro Devices that promise to simplify system architecture and embed additional security into hardware rather than the software methods now used.
The new chip designs will improve HAP systems’ integrity by facilitating remote attestation.
This process allows computers that communicate with one another across domains via classified networks to verify each machine’s right to access or modify data.
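The measurement chain behind such attestation is commonly built by "extending" a register with the hash of each booted component, so a single final value summarizes the whole sequence. A minimal sketch of that extend operation follows; it is simplified (real TPM PCRs have fixed banks, sizes, and algorithms) but shows why a verifier can replay the measurements and why tampering with any step changes the result:

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """Extend a register: new_value = SHA-256(old_value || H(component))."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

# Measure a hypothetical boot chain; order matters, and the final
# value commits to every component measured along the way.
pcr = bytes(32)  # registers start at a known value (all zeros here)
for component in [b"firmware", b"bootloader", b"kernel"]:
    pcr = extend(pcr, component)
```

A remote verifier holding the expected component hashes recomputes the same chain; any swapped or modified component yields a different final value, which is what lets machines "verify each machine's right to access or modify data."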
Doma, I spoke with Wave today....
As I understand it the entire VOSTRO line has been outsourced to Asustek and all Dell does is slap a logo on it. They ship from Asustek with Infineon TPMs and Infineon software. Wave's software works on top of Infineon's. It's confusing but not an issue.
And you're basing these statements on what? e/
Wave mention:
And Wave Systems demo'd strong authentication using its Embassy software for managing hardware security. "We don’t do encryption. We are protecting the data," says Lark Allen, executive vice president of Wave Systems.
Wave showed tools that support the next-generation Intel Centrino 2 with vPro, with TPM v 1.2. It also demonstrated management of the Seagate Momentus 5400 FDE.2 line of full-disk encryption drives.
RSA: Hashing Out Encryption
Vendors at RSA 2008 rolled out tools that make encryption easier to use and manage
http://www.darkreading.com/document.asp?doc_id=151078&WT.svl=news2_1
APRIL 14, 2008 | 5:50 PM
By Kelly Jackson Higgins
Senior Editor, Dark Reading
SAN FRANCISCO -- RSA 2008 Conference -- The conference that once was just a gathering of a few cryptographers is now a major event that drew more than 17,000 attendees last week. And the technology that started it all -- encryption -- showed it has grown a lot, too.
The big themes among encryption vendors exhibiting and rolling out new products here included managing encryption across the enterprise, making encryption easier to use, and a shifting focus from the nitty-gritty of encryption keys to the data itself. These themes aren't exactly new, but they were more front-burner than in years past, thanks to a busy year of high-profile data breaches, PCI mania, and laptop-theft paranoia.
With the pioneers of encryption chatting it up in the annual Cryptographer's Panel here as a backdrop, encryption vendors on the exhibit floor rolled out next-generation encryption management products and tools that help make encryption less a technology of complicated algorithms and key pairs and more of a mainstream business security strategy. But that doesn’t mean encryption is streamlined -- organizations today typically run a patchwork of separate encryption systems for various elements in their networks, from their files to their laptop hard drives.
Around 21 percent of U.S. enterprises surveyed in a Ponemon Institute and PGP study released this month say they currently have a consistent encryption strategy implemented across their organizations, which is an increase from last year, when only 16 percent did. Nearly 75 percent have an encryption strategy that's based on a type of data or application or is enterprise-wide, according to the study.
The number one reason for adding encryption: data breach prevention, with 71 percent of the vote, up from 66 percent last year, the study said. The most common encryption today is laptop encryption, which 20 percent of respondents use most of the time.
"Separate encryption systems all handle keys differently, and it's a policy" mess, says Gretchen Hellman, senior director of marketing for Vormetric, which specializes in policy-based encryption, access control, and auditing. Hellman is also the daughter of Martin Hellman of Diffie-Hellman algorithm fame.
RSA, the security division of EMC, here released its RSA Key Manager for the Datacenter product, which aims to centralize and integrate the lifecycle management of keys in the enterprise -- including in the database, file servers, and in storage systems.
"Multiple point encryption solutions, each with their own approach to encryption key management, increases management complexity and the risk of lost or stolen keys," said Dennis Hoffman, RSA's chief strategy officer, vice president, and general manager of its data security group, in a prepared statement.
According to the Ponemon-PGP study, organizations plan to spend 34 percent of their overall budget for encryption on key management (which includes key lifecycle, policy, and reporting), and 45 percent expect those systems to save them money on their data security costs.
Vormetric, meanwhile, rolled out what it calls the Key Security Expert, a tool for providing key security and access control for encryption keys across various encryption platforms in an enterprise. "It's a method to immediately address this ability to secure and control access to keys locally," Vormetric’s Hellman says. "Any third-party encryption key or homegrown solution -- we can control access to it."
Venafi, which sells what it calls systems management for encryption, demo'd its upcoming Encryption Manager V system at RSA, which will come with symmetric key support and enhanced auditing. Paul Turner, vice president of product and customer solutions for Venafi, says the new encryption management platform contains more policy-based management. It also integrates with existing key management tools.
"Most people are not key experts. So we had to make the policies simple," Turner says. Venafi doesn't provide encryption, just the systems management tools for it, he says.
BitArmor, meanwhile, upgraded its DataControl encryption software with support for Vista and Windows Server 2008, and plans to add management for Windows BitLocker Drive Encryption in the third quarter. "There are various types of encryption, but they are all separately focused on the device or app," says Patrick McGregor, BitArmor’s Chief Executive Officer. "We are taking an approach at the data level... we protect data at the core, and the keys are in the data itself. It's persistent encryption, a more elegant solution."
Other encryption announcements here included Voltage Security's new software-as-a-service model for its SecureFile encryption for documents and files, as well as increased systems integrator support for its format-preserving encryption technology, which encrypts data without changing the structure of the data. "Our goal is to make encryption usable," says Dan Beck, director of product management for Voltage, best known for its identity-based encryption technology for email encryption. The idea is to encrypt the data without changing the structure of the data, he says.
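One common way to build format-preserving encryption is a Feistel network run over the digit string itself, which is the general shape of standardized designs in this space. The sketch below is an HMAC-based toy Feistel of my own construction – it is not Voltage's technology or a vetted standard, and should never be used for real data – but it demonstrates the defining property: a 16-digit number encrypts to another 16-digit number, reversibly:

```python
import hashlib
import hmac

def _round(key: bytes, i: int, value: int, mod: int) -> int:
    """Keyed round function: HMAC-SHA256 of (round, half) reduced mod 10^half."""
    digest = hmac.new(key, f"{i}:{value}".encode(), hashlib.sha256).digest()
    return int.from_bytes(digest, "big") % mod

def fpe_encrypt(key: bytes, digits: str, rounds: int = 8) -> str:
    """Toy format-preserving encryption over an even-length digit string."""
    half, mod = len(digits) // 2, 10 ** (len(digits) // 2)
    left, right = int(digits[:half]), int(digits[half:])
    for i in range(rounds):
        left, right = right, (left + _round(key, i, right, mod)) % mod
    return f"{left:0{half}d}{right:0{half}d}"

def fpe_decrypt(key: bytes, digits: str, rounds: int = 8) -> str:
    """Invert the Feistel rounds in reverse order."""
    half, mod = len(digits) // 2, 10 ** (len(digits) // 2)
    left, right = int(digits[:half]), int(digits[half:])
    for i in reversed(range(rounds)):
        left, right = (right - _round(key, i, left, mod)) % mod, left
    return f"{left:0{half}d}{right:0{half}d}"
```

Because the ciphertext keeps the plaintext's length and character class, it slots into database columns and message formats that validate their fields, which is the whole appeal of the approach.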
And Wave Systems demo'd strong authentication using its Embassy software for managing hardware security. "We don’t do encryption. We are protecting the data," says Lark Allen, executive vice president of Wave Systems.
Wave showed tools that support the next-generation Intel Centrino 2 with vPro, with TPM v 1.2. It also demonstrated management of the Seagate Momentus 5400 FDE.2 line of full-disk encryption drives.
So is encryption now considered mainstream? Bruce Schneier, chief security technology officer for BT, says encryption today is "surprisingly mainstream," even though you can't really see it. "People don’t buy encryption, they use it," he says of end users. "It's in their browser, their VPN" connections. "And when it becomes ubiquitous, it disappears" into tools and products, he says.
PGP Corp.
Small Business Feedback Drives Dell Vostro Laptop Redesign
http://www.edubourse.com/finance/actualites.php?actu=39760
Major step in expanding entire laptop portfolio by 50 percent in 2008
Thinner, lighter form factors
More security features than comparable, competitive products¹
Dell today announced its redesigned Vostro™ laptop line for small businesses, including the new 13.3-inch Vostro 1310 starting at $749 and 15.4-inch Vostro 1510 starting at $599. Products are available today in Europe, Middle East and Africa, followed by North and South America May 1, and Asia Pacific and Japan May 5. The company will also offer a redesigned 17-inch Vostro 1710 laptop in mid May. For more information, visit dell.com/newvostro and Dell’s new small-business blog direct2dell.com/smallbusiness to join the conversation.
“Listening to customers and acting quickly on their feedback to improve our products and services sets Dell apart,” said Frank Muehleman, vice president and general manager, Dell Small and Medium Business. “For example, customers told us in a recent global laptop study that data protection and pre-installed security software are, by far, the most important security features.
“They also said they wanted thinner and lighter machines, all at a good price,” Muehleman said. “We paid attention.”
The new Vostro laptops deliver the following customer-driven features: A sleek design, with slot load optical drive, that is thinner and lighter than previous generation Vostro laptops;
The display screen of the new Vostro 13.3-inch has 94 percent of the viewing area of a 14.1-inch model, yet weighs nearly 20 percent less;
Continuation of Dell’s 30-day worry-free return policy;
Available with no trialware and free services including 10GB of Dell DataSafe Online Backup, Network Assistant and PC TuneUp; and
Vostro 1310 and 1510 laptops offer more security features than comparable competitive products¹ with integrated fingerprint readers, Trusted Platform Module (TPM) 1.2 helping prevent unauthorized network access and data theft, cable lock slot, and Dell-exclusive, factory-installed McAfee® Total Protection for Small Business security software².
Small businesses are increasingly adopting laptops over desktop PCs. Today, half of PC-owning small businesses in the United States own laptops, and the number is expected to grow to almost 57 percent by 2012³, with similar laptop penetration in Western Europe and Japan.
To address this shift and provide more choice to customers, Dell plans to expand its laptop portfolio by 50 percent this year, to deliver the broadest product lineup in the company’s history.
“After experiencing the productivity gains of the Dell Vostro 1510 compared to our older systems, I knew it was time to act on our dream to upgrade our IT network,” said Cynthia Ebrom, founder and owner of Edinburg, Texas-based Cynthia’s Cakes. “As our custom cake business expands nationally, technology becomes even more important to our growth and ability to better serve our customers. Vostro gives us the reliability, performance, service and support we need without exceeding our tight budget.”
In addition to investments in its laptop portfolio, Dell is expanding its services, server and storage offerings to meet the unique needs of small and medium businesses. Dell’s new services portfolio, ProSupport, gives customers the ability to customize and tailor services to fit their technical expertise and business needs.
Dell’s new PowerEdge R300 and T300 servers have industry-leading performance, memory and high-availability features so small and medium businesses no longer have to sacrifice server performance or reliability for an affordable price.
Through affordable iSCSI storage area network (SAN) solutions like the PowerVault MD3000i and Dell EqualLogic PS5000 Series, Dell is simplifying storage for businesses of all sizes.
Microsoft's trust chief steers his company toward Trusted model
http://www.betanews.com/article/Microsofts_trust_chief_steers_his_company_back_toward_Trusted_model/1208206883
By Scott M. Fulton, III, BetaNews
April 14, 2008, 5:24 PM
The first time Microsoft launched a Trustworthy Computing initiative, it was met with skepticism, especially with the way Bill Gates played it up. But six years later, a key Microsoft executive suggests it may be time to revisit the subject.
In a surprisingly frank white paper from the man in charge of Microsoft's Trustworthy Computing strategy, released this morning, Corporate Vice President Scott Charney writes that his company's own first two major initiatives toward providing greater security for software and Internet users fell short of their intended goals, and that a third initiative just now getting under way may still fail to completely address the problem of ensuring consumer safety and privacy.
A former US Justice Dept. official before joining Microsoft, Charney writes in "Establishing End-to-end Trust" (PDF available here) that a key goal of trustworthy computing is still to reliably authenticate users and the companies they represent, especially in business transactions. But the rapidly evolving nature of social computing, coupled with the curious requirement among consumers for not just privacy but anonymity, has thrown the biggest monkey wrench into the system.
"Ensuring that people can be identified raises the most complex social, political, and economic issues, with the No. 1 issue being privacy," Charney writes. "The concern is twofold: (1) If authenticated identity is required to engage in Internet activity, anonymity and the benefits that anonymity provides, will be reduced; and (2) authenticated identifiers may be aggregated and analyzed, thus facilitating profiling."
Identification is critical in the Trusted Computing model that Charney represents and promotes, because every computing transaction in this model, whether on the Internet or locally, must take place between either people or components that can identify themselves, and whose stated identities can pass a reasonable challenge. In a hypothetical world where every component does identify itself according to protocol, it can be assumed that the first task of a malicious user will be to bypass the system of identification, perhaps through spoofing someone or something else, and perhaps by overriding that particular step.
Over the last decade, Microsoft has had to play catch-up in this department, mainly because the distributed computing model it wanted to deploy first over the network -- the Component Object Model -- failed to include any rigorous method of authentication. Since then, the company has moved in stages toward more thoughtful practices, but even the act of migration has exposed some vulnerabilities which malicious users cannot resist the temptation to exploit.
Microsoft's first plausible initiative in this regard, Charney writes, was its "Secure by Design" principle, the current version of which is called SD3. The idea there was to stop producing software whose most exploitable features were turned on by default.
"There was, in fact, nothing wrong with this strategy as a foundation, and SD3 remains important today," Charney wrote. "The problem with SD3 lies in its inherent limitations. Even if products are engineered to be 'Secure by Design' and vulnerability counts continue to drop, it is indisputable that the number of vulnerabilities in large and complex products (several of which are likely to be installed on a single system) cannot be reduced to zero in the foreseeable future. 'Secure by Default' is inherently limited because the attack surface can only be reduced, not eliminated, and features are created precisely because a broad set of users need the feature activated. Similarly, many legacy software applications require the user to run as 'admin,' thus undermining some of the intended security benefits of running as a standard user."
In addition, he added, the practice of releasing patches in regular batches (with a nod to Dr. Seuss) actually helped spawn a cottage industry in reverse-engineering. The patches actually provide a road map to the problem, when a malicious user holds them up to a mirror.
So Microsoft moved on to its second initiative, "Defense-in-Depth." That had a lot to do with strengthening Windows' firewalls and turning off more features by default. But after users have seen all those warnings for the umpteenth time, Charney writes, "it remains true that users will click on malicious attachments sent to them from unknown sources."
And while it's nice to have reduced the attack surface on the surface by turning off volatile features by default, he notes that the reason those features were developed in the first place is so that they could be turned on. So just the off switch isn't enough.
What Charney advocates as a next course of action for Microsoft is a move back toward a bolder, more daring vision of security that it backed away from when "Secure by Design" was launched: a vision that incorporates more of the Trusted Computing principles that Chairman Bill Gates first advocated back in early 2002. Those measures were met with widespread skepticism as the whole notion of "Trusted" or "Trustworthy Computing" coupled with the Microsoft brand sparked notions of Big Brother, or of turning over control of users' hard drives to Hollywood studios. Eventually the negative publicity was so bad that its former Trusted Platform partner Intel steered clear of Microsoft's strategy in 2006.
In light of the less-than-complete objectives of Microsoft's first two public initiatives, though, a third one dedicated to the Trusted Platform may be met more positively today, Charney implies. But the biggest issue blocking that from happening now isn't the fear of Big Brother or DRM, he believes, but the Internet-using public's simultaneous insistence upon anonymity, privacy, and openness. Not all three may absolutely coexist, he suggests.
However, in a curious argument, he proposes that anonymity in the social sense may be impossible without an infrastructural means of securely identifying the anonymous party, in order to help guarantee that anonymity.
As Scott Charney writes, "Clearly, this approach will not satisfy those who see the Internet's anonymity as the ultimate protector of privacy. This may particularly be true in those cases where anonymity promotes and protects unpopular speech. But the fact remains that if we hope to reduce crime and protect privacy, we need to give users the ability to know with whom they are dealing (if they so choose) and law enforcement the capability to find bad actors.
"It is also important to remember," he continues, "that there are multiple privacy interests at stake here; for example, in the e-mail context it is not just the sender of a communication who may have a privacy interest, but the recipient may wish to be left alone. Indeed, any regime should not only seek to provide greater authentication to those that want to provide it or consume it, but also provide anonymity for those who wish to engage in anonymous activities. Users should be able to choose to send anonymous communications, and users should be able to choose to receive mail only from known sources."
Storage Managers Struggle With Security Demands
http://www.enterprisestorageforum.com/continuity/news/article.php/3740361
April 11, 2008
By Marty Foltyn
ORLANDO, FL. — Perhaps the biggest surprise at this week's Storage Networking World was just how central the role of security has become, as storage managers are increasingly pressed into service to plug data leaks and ensure compliance with data protection regulations (see Storage Becomes the Center of the Security Storm).
EMC boosted its storage security offerings, HP, IBM and Vormetric unveiled encryption key management products, and Seagate, IBM and LSI promoted disk drive encryption. And those were just a few of the announcements between this week's SNW and RSA conferences.
A tutorial at SNW by Roger Cummings of Symantec illustrated how storage managers can eliminate much of their vulnerability by using the right technologies for encryption. Cummings defined encryption as the conversion of plain text to encrypted text with access only by authorized users. He outlined a number of methods for protecting both data at rest and in flight, including encryption/decryption built into tape drives, and disks that encrypt data before storing it on media.
Cummings outlined a nine-step checklist for encrypting data at rest, beginning with understanding the reasons for confidentiality and working closely with legal counsel and company executives to identify regulatory obligations and develop IT strategic plans. Activating encryption is the last step, after classifying and inventorying assets, performing data flow analysis, encrypting as close to the source as possible, designing the solution with a focus on demonstrating the chain of evidence, and beginning data realignment to implement the solution.
Deploying fabric-based encryption was the recommendation of Roger Bouchard of Brocade, who said this approach reduces complexity by using a common method for encrypting all types of data residing on any storage device connected to the storage area network (SAN).
Consultant Richard Austin recommended that managers focus their storage security efforts on data leaving a storage manager's control, including data stored on removable media, data held in untrusted third-party data centers (which must be protected both in flight and at rest), and data transferred between trusted data centers (which must be encrypted in flight). Austin maintained that encrypting data at rest is a measure of last resort, requiring careful planning and methodic implementation.
The importance of a key management strategy generated considerable audience interest in a session led by Walt Hubis of LSI. Hubis recommended a series of best practices to deploy key management, including limiting the use of data encryption keys, enforcing strict access controls, and disposing of keys when no longer needed.
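Those three practices – use limits, strict access control, and disposal – can be illustrated with a toy key record. The class and method names here are hypothetical, and a real key manager would hold keys in hardware or sealed storage rather than process memory; this sketch only shows the lifecycle rules themselves:

```python
import secrets

class ManagedKey:
    """Toy key record enforcing a use limit, an owner check,
    and explicit disposal (zeroization) when no longer needed."""

    def __init__(self, owner: str, max_uses: int = 1000):
        self._key = bytearray(secrets.token_bytes(32))
        self.owner = owner
        self.uses_left = max_uses

    def use(self, requester: str) -> bytes:
        if requester != self.owner:
            raise PermissionError("access denied")          # strict access control
        if self.uses_left <= 0 or not self._key:
            raise RuntimeError("key exhausted or disposed")  # limited use
        self.uses_left -= 1
        return bytes(self._key)

    def dispose(self) -> None:
        for i in range(len(self._key)):  # overwrite before releasing the buffer
            self._key[i] = 0
        self._key = bytearray()
```

Capping the number of operations per key bounds how much ciphertext any one key protects, and zeroizing on disposal is what "disposing of keys when no longer needed" means in practice.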
SNW attendees also got hands-on practice in using a software approach to encryption, performing hardware encryption with keys provided by the backup application, and reading tapes encrypted by one drive in another drive.
Protecting data is also a top priority for storage managers because of the potential for stiff fines for failing to produce data in e-discovery cases, according to David Stevens of CMU. In a session on the December 2006 changes to the Federal Rules of Civil Procedure (FRCP), Stevens discussed the need to preserve to the best of a manager's ability all the details of the original electronically stored information (ESI), if not producing the original itself. He said a company (and its storage manager) may be requested to produce ESI even if it is not a party to the litigation.
He recommended creating and following a company's data/ESI retention policy, including auditing compliance with the policy, knowing where data resides, knowing how to preserve ESI, and maintaining a chain of custody for the data. Stevens reminded attendees that at least 37 U.S. District Courts now require compliance with specialized local rules, forms and guidelines addressing the discovery of electronically stored information.
GREAT tpm mention by Microsoft................
Microsoft eyes less obtrusive security
http://searchsecurity.techtarget.com/news/article/0,289142,sid14_gci1309350,00.html
SAN FRANCISCO -- The future of Windows security likely will involve a kind of back-to-basics approach to preventing attacks and malware infections through the use of features such as application whitelisting, further integration of TPMs and more extensive use of code signing.
Microsoft Corp. has been working on many of these technologies for several years and some of them are already used in various forms in Windows XP and Vista, but the company is working on ways to make the operating system and core applications smarter and more efficient at blocking threats as early in the process as possible, according to Microsoft product unit manager David Cross.
During a session Thursday at RSA Conference 2008, he said the company is pleased with how such Vista security features as the User Account Control (UAC) have worked out, but that the company is seeking ways to make them more automated and less invasive for users.
"The reason we put UAC in Vista was to annoy users," Cross joked. "But seriously, we needed to change the ecosystem and we had to use a pretty heavy hammer to do it."
Cross said Microsoft has been analyzing data collected from more than a million Vista systems and found that the majority of user sessions don't have any UAC prompts in them, and that the number of programs that are generating UAC prompts is dropping.
Still, he said, Microsoft is looking to make the security features less obtrusive in Vista and future versions of Windows. Specifically, the company wants to make better use of things such as application whitelisting, which prevents any application from running other than those explicitly allowed by the user. This can not only enable administrators to prevent employees from running unwanted but legitimate applications like Skype or Gnutella, but can also stop malware from executing.
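Application whitelisting as described here reduces to a default-deny hash check. The sketch below is illustrative only (it is not Microsoft's implementation, and the binary contents are stand-ins):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Fingerprint a binary by its SHA-256 hash."""
    return hashlib.sha256(data).hexdigest()

def may_execute(binary: bytes, allowlist: set) -> bool:
    """Default-deny: a binary runs only if its hash was explicitly approved."""
    return sha256_of(binary) in allowlist

# The administrator approves known-good binaries by hash; everything
# else, including malware, is blocked from executing.
approved = {sha256_of(b"contents of notepad.exe")}
```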
The company also has been working on better ways to isolate running applications and integrate code signing with UAC. Much of the work Microsoft is doing is a result of the decreasing effectiveness of classic signature-based defenses such as antivirus, IDS and antispyware software. Signatures are of little use against threats that shift tactics and behaviors continuously.
"The threats are more complex. It's a maze now. We're seeing on average about a thousand new threats every day," said Vinny Gullotto, head of Microsoft's Malware Protection Center, who spoke during Cross's session as well. "I'd say back in the days of LoveLetter and Nimda, we would see about 500 a month. Signature-based technology should be a final backstop. Behavior monitoring should be the main defense."
Gullotto said that sophisticated threats such as rootkits and custom Trojans used in highly targeted spear phishing attacks present unique problems that can't be solved with signature-based tools. "Rootkits are still a big concern," he said. "I don't think we've seen the peak of the problem with them yet."
Cross said he expects Microsoft to invest more heavily in a number of other security areas as well, including better integration of trusted platform modules into the computing environment.
OT-ish: NXP 'venture' conceals ST's chip strategy
(editor's note) This week's blockbuster deal between STMicroelectronics and NXP Semiconductors is a game changer for several reasons. First, it reveals how Geneva-based ST is transforming itself to target key semiconductor market segments like wireless chips and memory products. Second, as European editor Peter Clarke points out, the joint venture deal likely marks the end of chip manufacturing in Europe as top companies like ST and NXP move to a fabless model. The key motivation for this shift is reducing soaring operating costs. As we've reported, Advanced Micro Devices is feeling the same pressures. It appears likely that most chip makers, with the exception of top dog Intel, will move to the fabless model as a way to survive and, hopefully, prosper.
http://www.eetimes.com/showArticle.jhtml?articleID=207200040
The headlines are all wrong. STMicroelectronics isn't pooling its wireless business into a joint venture with NXP's wireless chip unit. No. ST is buying NXP's wireless business.
This distinction is important because it is critical to understanding how ST's management is steadily reshaping the Geneva-based company in response to events within the larger semiconductor world and investor dissatisfaction with its depressed stock prices.
The transaction gives NXP a much needed $1.5 billion cash injection. But ST, the majority owner of the new wireless IC company announced Thursday (April 10), gets a double benefit: reduced operating costs and the chance for a hefty payout whenever the JV sells shares to the public.
Over the last two years, ST has engineered several strategic deals that are apparently aimed at making the company leaner and more focused on specific market segments.
For instance, in December 2006 ST announced it would create a stand-alone flash memory group and consolidate all of its NAND and NOR products into the new division.
The transaction with NXP builds on the trend at ST towards reinforcing key operations through strategic acquisitions while spinning off specific businesses to reduce costs and gain market leverage.
While the companies are touting the latest deal as an ST-NXP joint venture, the details show this was all about ST. The company said it would make an initial down payment of $1.5 billion on the transaction to NXP while retaining the right to buy out its JV partner's remaining 20 percent stake.
"In order to create a clear ownership structure, STMicroelectronics will take an 80 percent stake in the joint venture," the companies said in a statement. "The parents have also agreed on a future exit mechanism for NXP's ongoing 20 percent stake."
Translation: ST will own the joint venture and NXP, the junior partner, will exit the organization as soon as the new company can afford to buy it out.
Encryption focus shifts to disk drives
http://searchstorage.techtarget.com/news/article/0,289142,sid5_gci1309340,00.html#
There were several technical reasons why early products aimed at storage encryption failed. They were often slow, did not scale well and had rudimentary key management. But perhaps they were just aimed at the wrong media.
The first encryption appliances from vendors such as Decru (now part of NetApp), Neoscale (now part of nCipher) and Kasten-Chase (now shuttered) were considered tape encryption devices. That made sense at the time because the biggest perceived threat to data at rest was lost tapes.
Now, the future of storage encryption has shifted to disk. EMC unveiled a disk encryption product at SNW, and while Seagate launched its FDE (full-disk encryption) Cheetah drives at the RSA conference, its storage partners IBM and LSI vowed their continued support at SNW.
EMC built encryption from its RSA security company into its PowerPath management software. EMC PowerPath Encryption with RSA encrypts and decrypts data at the host as it is sent to and from the disk array and uses RSA Key Management to manage encryption keys.
Why encrypt disk arrays that reside in the data center and aren't likely to get lost? Doc D'Errico, EMC vice president of infrastructure software, points out that disk leaves the data center more than most people think. When drives are repurposed or decommissioned, sensitive data can be at risk.
IBM first demonstrated disk encryption with Seagate and LSI last fall. IBM also encrypts tape in its TS1120 libraries, and its storage general manager Andy Monshaw told an SNW audience that encryption is a technology that needs to be implemented across the board. "A few years ago, some tapes fell off a truck, and we saw a rush of encryption devices," Monshaw said. "But they were point products. They didn't scale
Intel predicts staggering growth for "netbooks"
http://www.pcpro.co.uk/news/186993/intel-predicts-staggering-growth-for-netbooks.html
Intel claims the low cost PC market that covers the Eee PC and desktop equivalents is ready to explode in growth.
The chip giant is anticipating sales of around 47m units from various manufacturers between 2008 and 2011, topping 100m if you count the handheld internet devices in which the company is taking a renewed interest following the release of its Atom platform.
Unsurprisingly, Asus is expecting to profit significantly from this boom in interest, with president Jerry Shen predicting shipments of 10 million in 2008 and 20 million in 2009. Not bad considering the Eee PC was originally envisioned as a niche laptop for children and education.
The low cost laptop market is one that nearly every major manufacturer has now expressed an interest in, with Intel gearing up to launch its Classmate 2 on European shores, Asus readying the next generation of Eee PC, HP unveiling the Mini-Note and even Dell reportedly scouting out the terrain.
Intriguingly, back in February Sony predicted the market would be in serious trouble if the Eee took off, claiming "If (the Eee PC from) Asus starts to do well, we are all in trouble. That's just a race to the bottom."
anvil.........
[FDE] Privacy Act's "actual damages" clause DOES NOT require actual monetary damages
In the recent American Federation of Government Employees (plaintiff) v. Kip Hawley, in his official capacity as Administrator for TSA, the plaintiffs alleged that defendants violated the Aviation and Transportation Security Act ("ATSA") and the Privacy Act by failing to establish appropriate safeguards to insure the security and confidentiality of personnel records, which resulted in the unintended disclosure of Personally Identifiable Information (PII) of 100,000 TSA employees.
The defendants argued "that the individual plaintiffs should be dismissed for lack of standing for failing to demonstrate an injury-in-fact. Mot. Dismiss at 13.11 According to defendants, plaintiffs' concerns about future harm are speculative and dependent upon the criminal actions of third parties. Mot. Dismiss at 13–15"
The court, however, disagreed:
"Plaintiffs allege that because TSA violated § 552a(e)(10) by failing to establish safeguards to secure the missing hard drive, they have suffered an injury in the form of embarrassment, inconvenience, mental distress, concern for identity theft, concern for damage to credit report, concern for damage to financial suitability requirements in employment, and future substantial financial harm, [and] mental distress due to the possibility of security breach at airports." Compl. 41–42. As such, plaintiffs' alleged injury is not speculative nor dependent on any future event, such as a third party's misuse of the data.12 The court finds that plaintiffs have standing to bring their Privacy Act claim."
For details see:
https://ecf.dcd.uscourts.gov/cgi-bin/show_public_doc?2007cv0855-6
http://cyberlaw.stanford.edu/node/5734
The outcome of this case could have far-reaching implications for future data leaks.
Intel Details Plans for Laptop Anti-Theft Tech
http://www.pcmag.com/article2/0,2817,2282714,00.asp
Storage companies like Seagate have also released hard drives with Full Drive Encryption (FDE) technology, which encrypts the drive's contents and prevents them from being used without the correct password.
Go-kite...
I put it up because I noticed they're FDE drives, too. Not because of data centers.
Seagate(R) Cheetah(R) Drives Enter New Habitat - Self-Encrypting Hard Drives for Servers and Storage Arrays
Posted : Mon, 07 Apr 2008 12:00:37 GMT
Author : Seagate Technology
Category : Press Release
Demand for security tools protecting mission-critical enterprise data drives the need for Seagate Cheetah 15K.6 FDE hard drives
SAN FRANCISCO, April 7, 2008 /PRNewswire-FirstCall/ -- RSA CONFERENCE -- Seagate Technology today introduced a new breed of hard drive, the Cheetah(R) 15K.6 FDE (Full Disk Encryption) disc drive family, the world's first self-encrypting hard drives for mission-critical servers and storage arrays. As part of the award-winning Cheetah family, the industry standard for performance and reliability in data centers, the new Cheetah 15K.6 FDE hard drive also encrypts data. And that encryption goes anywhere the hard drive goes -- whether it is moved, stored, or retired.
"The data breaches widely reported in the media generally focus on stolen laptops and PCs, but people forget about the staggering amount of information leaving the data center daily," said Sherman Black, senior vice president and general manager, Seagate Enterprise Compute Business. "Equipment and systems with hard drives inside are continuously being retired, relocated or repaired and there's often little thought given to properly disposing of the data they contain before they leave the data center. A recent investigation showed that 50% of the drives returned for servicing by customers contained readable sectors. If you assume that an average system's lifecycle is three to five years that suggests that more than 50 thousand enterprise drives are leaving data centers daily worldwide. If only half of those hard drives are readable, that's at least 2,500TB per day of exposed data available in the open market. The increasing flow of exposed sensitive data ought to be a serious concern to CIOs everywhere."
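Black's figures can be sanity-checked; note that the roughly 100GB of recoverable data per readable drive is our inference from his numbers, not a figure stated in the release:

```python
drives_leaving_daily = 50_000          # Black's estimate of drives leaving data centers
readable = drives_leaving_daily // 2   # "only half of those hard drives are readable"
exposed_tb_per_day = 2_500             # the exposure figure quoted in the release

# Implied average of recoverable data per readable drive:
# 2,500 TB / 25,000 drives = 100 GB (our inference).
implied_gb_per_drive = exposed_tb_per_day * 1_000 / readable
```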
Compared to other encryption technologies, self-encryption within the hard drive brings significant performance, management, and security benefits for users. Since the encryption engine is in the drive's controller ASIC, encryption is transparently fast and performance automatically scales with every drive added to a data center. Because there is no performance cost associated with encrypting more data, there is no need to make fine-grained decisions as to what data to protect -- which can eliminate the need for data classification. Self-encryption requires no change to the OS, applications, or databases. Instantaneous Key-Erase(TM) technology, a standard on all Seagate FDE hard drives, facilitates quick and secure removal, whether for repurposing, returning for service, or disposal.
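The crypto-erase idea behind Instantaneous Key-Erase can be illustrated with a toy cipher. The SHA-256 keystream below merely stands in for the drive's real AES hardware; the point is that destroying the small media key, rather than overwriting the platters, is what renders the data unreadable:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (SHA-256 counter-mode keystream). It stands in
    for the drive's AES engine purely to illustrate crypto-erase."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

plaintext = b"quarterly financials"
media_key = secrets.token_bytes(32)     # lives only inside the drive controller
stored_on_platter = keystream_xor(media_key, plaintext)

# Normal operation: the drive decrypts transparently with the media key.
readback = keystream_xor(media_key, stored_on_platter)

# Key-erase: replacing/destroying the 32-byte key leaves every sector
# unreadable, with no need to overwrite the whole drive.
media_key = secrets.token_bytes(32)     # old key is gone; only ciphertext remains
garbage = keystream_xor(media_key, stored_on_platter)
```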
Leading analysts, standards bodies, and major storage providers have closely evaluated self-encrypting drives and concluded that they deliver critical benefits for data center information security.
Gartner:
"Many organizations are considering drive-level security for its simplicity in helping secure sensitive data through the hardware lifecycle from initial setup, to upgrade transitions and disposal," said Eric Ouellet, research vice president, Secure Business Enablement, Gartner. "Hard drive disposal in particular has always been one of the most challenging elements of the data security lifecycle. Even with secure disposal processes in place, misplacement, mislabeling and theft still do occur which can result in significant losses, possible penalties and fines. Eliminating the risk of compromise from the source is one approach that can significantly reduce the complexity of managing sensitive data."
IBM:
"Enterprise customers today, especially in the financial services sector, have a keen interest in protecting data-at-rest. Natively securing data in the storage drive without any system performance degradation is the next frontier in truly securing data and storage media that eventually leaves the data center," said Robert Cancilla, vice president of Disk Systems, IBM. "Introduced over a year ago, IBM's self encrypting tape solutions, featuring key management software, continue to address this security priority without the traditional impact to I/O performance, while simplifying encryption key management. We are excited about working with Seagate to bring this model to our disk offerings."
LSI:
"With the continued headlines about data breaches and emerging government mandates, the need for data-at-rest encryption has come front and center," said Phil Bullinger, executive vice president, Engenio Storage Group, LSI. "By encrypting at the drive level, users can potentially eliminate application impact and reduce worry about drives being lost, stolen or repurposed. LSI is pleased to be working with other industry leaders to drive critical developments in standards-based encryption technology."
Trusted Computing Group:
"Full drive encryption enabled in hardware and based on the open standards created by the Trusted Computing Group with Seagate's leadership can give administrators and users confidence that data will be encrypted quickly, easily and always," noted Brian Berger, Trusted Computing Group marketing work group chair. "As demonstrated by the rapidly increasing number of lost, stolen or hacked drives, encryption in hardware really is the most effective solution to help ensure the security of at-rest mission-critical information. Otherwise, corporations potentially face millions or worse in government fines, lost business, lost goodwill and undermining of other corporate relationships."
As a complementary solution for data protection of non-encrypted drives that leave the data center, the Seagate Recovery Services division recently announced its Data Erasure suite of products that utilize a series of advanced, defense-industry rated and approved algorithms to completely and permanently erase all data from a disk drive to ensure that proprietary and sensitive information does not get into the wrong hands. For more information, please contact a Seagate Recovery Services expert at 800-475-0143 or visit http://services.seagate.com/srs.
Details about the Cheetah 15K.6 FDE Family of Hard Drives
Available in capacities of 450GB, 300GB, and 147GB, the Cheetah 15K.6 family includes Seagate PowerTrim(TM) technology which dynamically optimizes drive power consumption at all levels of activity. The Cheetah 15K.6 FDE family offers the highest 3.5-inch hard drive reliability in the industry at 1.6 million hours MTBF (0.55% AFR), a choice of Serial Attached SCSI (SAS) or Fibre Channel (FC) interfaces, and a five-year limited warranty. The Cheetah 15K.6 FDE drive is shipping to OEM suppliers this quarter.
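The quoted 0.55% AFR is consistent with the 1.6-million-hour MTBF, assuming a constant failure rate over a year of powered-on operation:

```python
import math

MTBF_HOURS = 1.6e6                 # Seagate's quoted mean time between failures
HOURS_PER_YEAR = 24 * 365          # 8,760 power-on hours per year

# With a constant (exponential) failure rate, the annualized failure
# rate follows directly from the MTBF:
afr = 1 - math.exp(-HOURS_PER_YEAR / MTBF_HOURS)
print(f"AFR = {afr:.2%}")          # ~0.55%, matching the spec sheet figure
```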
Encryption Technology at RSA
Seagate will be conducting a number of demonstrations at the RSA Conference this week (booth #1448) to showcase its enterprise, notebook, and external hard drive solutions. In addition, Seagate has teamed with Wave Systems (booth #728) and the Trusted Computing Group (booth #2723) to jointly conduct client and server demonstrations. For more information about the RSA Conference program, visit: http://www.rsaconference.com/2008/US/home.aspx
EMC, IBM, and Seagate talk drive encryption for data center storage
Storage Drive Encryption Featured at Shows
By Mary Jander, April 9, 2008, 1:00 PM
ORLANDO, Fla. -- Cisco, EMC, IBM, Seagate, and LSI are talking up disk drive encryption for data center storage this week, but they're not necessarily on the same page.
IBM was first to reveal plans Tuesday at SNW to deploy the same kind of encryption integral to its TS1120 tape library on its DS series disk arrays.
IBM's plan is based on its relationship with Seagate, which unveiled its Cheetah 15K.6 FDE (Full Disk Encryption) drive at the RSA show in San Francisco on Monday, as IBM was talking things up in Orlando. Seagate first unveiled FDE for laptops last summer.
Seagate plans to ship the new FDE drive this summer. If it does, it will be among the first major drive vendors to offer an encrypted drive for data center applications. Hitachi Global Storage Technologies has encrypted drives for notebooks, but doesn't presently offer them for data center gear.
LSI got into the act yesterday with a suite-based press interview at SNW, during which Claudine Simson, LSI's CTO, vowed that LSI would continue to support Seagate's drive-level encryption in the future, principally by supporting Seagate drives in the arrays IBM OEMs from LSI. These arrays include the IBM DS3000 series of lower-end enterprise systems and IBM's DS4000 midrange SANs.
Besides supporting standard AES 128 encryption in the Seagate-based systems, LSI plans to create silicon used in the encryption infrastructure that also performs content processing and analytics, according to Simson. "We will use enabling silicon to make sure keys are properly managed," she said.
Separately, EMC announced the EMC PowerPath Encryption with RSA, an integration of RSA's key management capabilities into the PowerPath multipathing technology used in EMC SANs. The package will manage encrypted keys for Symmetrix and Clariion systems, as well as for other storage gear linked to them. (EMC says it supports key management from a range of vendors, including IBM, via the RSA software.)
EMC also was part of a broader announcement with Cisco regarding data protection and security, which was made at RSA. The focus of the announcement was RSA's recently announced DLP Suite, which offers deep packet inspection of data to help identify information that might be sensitive to the enterprise.
This week's announcements show that key suppliers are intent on delivering encryption for storage managers, who are struggling not only with the performance hits encryption takes on their systems and networks, but also with the multiple encryption keys generated by systems and software.
At the same time, the news shows that the vendors continue to work separately toward better solutions.
Seagate, LSI, and IBM think that putting encryption locally into drive chips solves any performance problems, and they see this as the approach to solve key management woes as well. The vendors are working in the IEEE to standardize a uniform format for key management via the P1619.3 working group, although that work seems to be happening in the background, secondary to their main focus on productizing disk-based encryption.
EMC has participated in IEEE P1619.3, but spokespeople could not identify ongoing efforts at press time. EMC spokespeople position PowerPath Encryption for RSA as a product that competes with IBM's projected offering but is presently available. Clearly, EMC sees its efforts as distinct from those of IBM and its partners.
Microsoft: Smarter IDs needed for Net gains
http://www.infoworld.com/article/08/04/08/Microsoft-Smarter-IDs-needed-for-Net-gains_1.html
By Matt Hines
April 08, 2008
Microsoft's chief security strategists are asking for help.
The massive software vendor is working harder than ever to do its part to improve online security, but the company cannot solve all the electronic world's ills alone and must have broader support from across the IT and Internet communities to speed up progress, Microsoft officials said at the ongoing RSA Conference 2008 in San Francisco.
Despite the fact that Microsoft and other mainstream technology vendors have made a concerted effort to improve the quality of their products and services -- over the last several years, in particular -- to respond to the Internet's blossoming security epidemic, today's problems are too widespread and fast-moving to be addressed unless new industry standards and technological vehicles can be created to help foster stronger online protection, executives with the company said.
Only by driving industry collaboration around issues of online authentication and identity protection can the Web be made a place where people can again trust the systems and services they seek to use with any level of confidence, said Scott Charney, corporate vice president of Trustworthy Computing at Microsoft.
Just as Microsoft has utilized its Trustworthy Computing initiative in an effort to reduce the number of vulnerabilities in its products and integrate stronger security tools into its software and online services, the Internet community at large needs to readdress authentication and identity if it hopes to regain users' faith, Charney said.
The executive has also authored a 20-page white paper manifesto outlining Microsoft's hopes for broader collaboration around online security and trust. The company's chief research and strategy officer, Craig Mundie, outlined the vision further in his RSA keynote address on Tuesday.
"For a long time, the industry didn't do security well, and because of its market share, Microsoft became a very important player in all of this," said Charney. "We think that we've done a good job of improving things over the last six years, but still it's not enough, and we need industry cooperation to do more in the Internet space."
Charney's paper and Mundie's speech express the need for a vision of "end-to-end trust" to be embraced among many different technological and social constituencies to aid in everything from helping companies do business faster and more securely online, to better protecting children who access the Web.
Technology vendors, service providers, industry bodies, and government agencies must team to create methods by which people can communicate online with assurance about each others' identities while preserving important issues of privacy and anonymity, the experts said.
Microsoft's latest strategy calls for continued development of a "trusted stack" of IT products and online services, throughout which individual elements will authenticate with one another more comprehensively, reaching from the operating system all the way to end-user devices and applications.
Another prerequisite will be a system that includes elements of authentication and audit, while allowing individuals to preserve their identities online. The company also contends that there is a need for new industry standards and regulations that help the entire ecosystem to survive and flourish, Charney said.
"The things that we've done [at Microsoft] to date are foundational and need to be taken to the next level; we've made the OS more secure, but subsequently, the attacks have moved up the stack into applications," he said. "As an industry and as a society, we've already done a lot of good things to help improve online security, but a lot of the threats are such that we need to push this issue of trust and collaboration not just within the industry, but also with consumer groups, politicians, and privacy advocates."
In terms of the work already being done along these lines, Charney pointed to projects such as the Trusted Computing Group's Trusted Platform Module (TPM) hardware-based encryption standard as an example of the type of initiative that will need to be expanded even further in the coming years.
Microsoft will build hooks for more native systems of security and privacy into everything from its Windows OS and Office products to its own online properties and mobile device technologies, the executives said, but the company's central hope is that its call for action echoes across the RSA conference and the IT community at large, said Charney.
"Work with us, help inform us, we are a technology company, and we're trying to do a better job of engaging with important constituencies, including our governments. The result of those discussions bring into relief how hard these problems are, and how difficult the trade-offs will be," he said. "We're not presumptuous about this; it's easy to pose these questions -- the hard part will be finding the answers."
ClayTrader.. very interesting!!!!!!!!
thanks
Fullmoon
Microsoft message to security world: Trust Us
http://www.news.com/8301-10784_3-9914240-7.html
In a keynote at the RSA conference last year, Microsoft Chairman Bill Gates and Craig Mundie, chief research and strategy officer, said the company had more to do to improve security.
Mundie and Chris Leach, chief information security officer at Affiliated Computer Services, followed talking points about Microsoft's latest vision for End to End Trust, describing it as an industry call to action.
"The foundation has been laid for good security practices," Mundie said. "The challenge now is related to management practices."
It's all about establishing that you are who you say you are.
"We need new forms of credential," Mundie said. "You should be able to present a cert (certificate) that says, 'Hey, I'm over the age of 18'...and allow a Web site to know that you are an adult."
Mundie was laying out the parameters for Microsoft's vision for security so that the interested parties would build around the company's framework.
As if on cue, he said: "The overall management systems today are not integrated enough, they're too complicated. That has been a major focus for Microsoft." And he mentioned some Microsoft products that solve those problems.
I showed Bruce Schneier, chief security technology officer for BT, the End to End Trust documents and he said "it feels general and like marketing hype." The notion that the world needs centralized authentication "is just silly," he added.
Basically, Microsoft has used its trusted computing efforts, such as inserting identity rights management into Office 2003, to lock people into using its products, Schneier said.
"Microsoft has used this as an anti-competitive tool," he said.
In a briefing on Monday, George Stathakopoulos, general manager of Microsoft's Trustworthy Computing group, was mentally prepared for the criticism.
"With everything we do, there is always skepticism and conspiracy theories," he said. "The answer is no; this is for real."
Microsoft’s End to End Trust vision:
Can this identity, trusted stack thing work?
Posted by Larry Dignan @ 9:46 am
Microsoft outlines the components of a trusted stack:
Because all software operates in an environment defined by hardware, it is critical to root trust in hardware. Today, many computers come with a Trusted Platform Module (TPM), a technology that will expand and enter new form factors…

The operating system must be verifiable based upon keys stored in the hardware (e.g., “trusted boot”). This allows the device to claim that the operating system has not been tampered with to bad effect…

Computers were, of course, designed to run code, without concern about its authorship or the intent of that author. Today there are multiple ways to help protect people from software vulnerabilities and malicious code. To protect users from vulnerabilities, code can be rewritten in safer languages, checked with analytic tools, compiled with compilers that reduce vulnerabilities (e.g., buffer overruns) and sandboxed when executed…

A safer Internet needs to support the option of identities based directly or derivatively upon in-person proofing, thus enabling the issuance of credentials that do not depend upon the possession of a shared secret by the person whose identity or identity attribute is being verified. To some extent, government activities and markets themselves are driving in-person-proofing regimes…

Applications should incorporate seamless mechanisms for applying signatures to their outputs, and read signatures before opening documents, so that data origin and data integrity can be easily checked…

An audit trail is a record of a sequence of events from which a history may be reconstructed. An audit log is a set of data collected over a period of time for a specific component. A series of audit logs can be studied to determine a pattern of system usage that, over time, can be used to highlight aberrant behavior such as criminal activity or the existence of malware. Audit data is also necessary to roll back suspicious or harmful transactions.
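The sign-on-output, verify-before-open pattern in that excerpt can be sketched in a few lines. An HMAC shared-secret tag stands in here for the public-key signatures a real deployment would use, just to keep the sketch self-contained:

```python
import hashlib
import hmac

def sign(key: bytes, document: bytes) -> bytes:
    """Attach an integrity tag when the document is produced. (A real
    deployment would use public-key signatures; a shared-secret HMAC
    keeps this illustration stdlib-only.)"""
    return hmac.new(key, document, hashlib.sha256).digest()

def verify_before_open(key: bytes, document: bytes, tag: bytes) -> bool:
    """Check data origin and integrity before the document is opened."""
    return hmac.compare_digest(sign(key, document), tag)
```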
Microsoft unveiled its “End to End Trust” security vision and now the real work begins: Will anyone buy into it?
At RSA, Microsoft rolled out a whitepaper that at the very least is quite the conversation piece.
Reading through the whitepaper–something I encourage everyone to do (yes all 22 pages)–I can see a few phases on this one:
First, the mocking: “Who is Microsoft to pitch this trusted Internet thing?”
Then the details: “This whole thing is based on identity management on a grand scale. Sounds Big Brother-ish.”
Then the idea of the trusted stack: “Is this like herding technology industry cats?”
And then some mild acceptance: “Maybe some of these ideas aren’t half bad.”
OK, that last phase may take a while (like maybe never), but the debate is worth having. Here are some key excerpts and my take.
Microsoft says:
This paper is an invitation to discuss how one might fundamentally “change the game,” and provides a framework for discussing the myriad of social, political, economic and technological issues that must be addressed if we want to create a meaningfully more secure and privacy-enhanced Internet. In short, in our view changing the game requires two things: (1) building a trusted stack, with suitably strong authentication of hardware, software, people and data; and (2) improving the ability to audit events to provide accountability. We must also grant people better control over their digital personas to enhance privacy. This trusted stack, combined with better mechanisms to protect privacy, will enable End-to-End (E2E) Trust — giving people, devices and software the ability to make and implement good decisions about who and what to trust throughout the ecosystem. This will help protect security and privacy as well as help bring criminals to justice when electronic malfeasance occurs. In sum, the opportunity exists to create a trusted, privacy-enhanced Internet.
My take: This passage appears on page 4. I can already hear the hackles rising, and the privacy-vs.-tracking issue is huge.
Microsoft says:
Current strategy does not address effectively the most important issue: a globally connected, anonymous, untraceable Internet with rich targets is a magnet for criminal activity — criminal activity that is undeterred due to a lack of accountability. Moreover, the Internet also fails to provide the information necessary to permit lawful computer users to know whether the people they are dealing with, the programs they are running, the devices they are connecting to, or the packets they are accepting are to be trusted.
My take: Hard to argue with this one, but which entities will secure these identity attributes?
Microsoft says:
Although trust may be a complex issue, this does not alter the fact that certain foundational elements must be in place to create a more trustworthy environment. The most important element is an authenticated identity attribute (e.g., name, age or citizenship); absent the ability to authenticate a person (or a personal attribute), machine, software, and/or data — and absent the ability to combine that authenticated data with other trust information (e.g., prior experience, reputation), effective trust decisions cannot be made. Second, absent the ability to identify and prove the source of misconduct, there can be no effective deterrent — no effective law enforcement response to cybercrime and no meaningful political response to address international issues relating to cyber-abuse. To date, the “response” to computer abuse of all types has been to increase defenses, but the history of computer security shows that offense will beat defense in cyberspace because attackers have an abundance of time and resources, and may only need to find one weakness, whereas a defender must cover all avenues of attack. Experience shows that most cybercriminal schemes are successful because people, machines, software and data are not well authenticated and this fact, combined with the lack of auditing and traceability, means that criminals will neither be deterred at the outset nor held accountable after the fact. Thus the answer must lie in better authentication that allows a fundamentally more trustworthy Internet and audit that introduces real accountability.
My take: Complex issue indeed. Will this audit be real-time? And what are the chances of this identity scheme actually tracking down a criminal? It's not as if these folks will fork over any identity attributes. What happens if the identity isn't in the U.S.?
Microsoft says:
We must create an environment where reasonable and effective trust decisions can be made. We must also create an environment where accountability — and therefore deterrence — can be achieved. To do this, one must have access to a “trusted stack”: (1) security rooted in the hardware; (2) a trusted operating system; (3) trusted applications; (4) trusted people; and (5) trusted data. The entire stack must be trustworthy because these layers can be interdependent, and a failure in any can undermine the security provided by the other layers; for example, a document may be created by an identified individual, using secure hardware and a secure operating system, and sent to another as a signed attachment with integrity, but if it was created with an insecure application, it may not be trustworthy.
My take: This approach is more secure, for sure, but it doesn't sound user-friendly at all. Can it be automated?
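On the automation question: the signed-attachment scenario in the excerpt can, in principle, be checked without user involvement. The sketch below is not from the whitepaper — it uses Python's stdlib HMAC as a stand-in for the public-key signatures the trusted stack actually envisions, and the key and document names are invented — but it shows the shape of a check-the-signature-before-opening step.

```python
import hashlib
import hmac

# Hypothetical signing key. In the whitepaper's model this would be the
# sender's private key (public-key signatures, possibly TPM-backed);
# a shared-secret HMAC is only a stdlib stand-in for illustration.
SIGNING_KEY = b"example-signing-key"

def sign_document(data: bytes, key: bytes) -> bytes:
    """Producer attaches a signature to its output."""
    return hmac.new(key, data, hashlib.sha256).digest()

def open_if_trusted(data: bytes, signature: bytes, key: bytes) -> bytes:
    """Verify data origin and integrity before 'opening' the document."""
    expected = sign_document(data, key)
    if not hmac.compare_digest(expected, signature):
        raise ValueError("signature check failed: untrusted or altered data")
    return data

doc = b"quarterly report"
sig = sign_document(doc, SIGNING_KEY)
opened = open_if_trusted(doc, sig, SIGNING_KEY)  # opens cleanly
```

An application following this pattern would refuse to open a document whose bytes were altered in transit, with no decision required from the user — which is roughly the automation the stack implies.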
Microsoft says:
First, nothing in this paper is meant to suggest that anonymity on the Internet be abolished. To the contrary, anonymity should be preserved and enhanced through both technology and social policy…Second, nothing in this paper is meant to create unique, national identifiers, even if some countries are creating identity systems that do so…Third, nothing in this paper supports the creation of mega-databases that collect personal information…Fourth, there is no claim that creating an authenticated, audited environment has no impact on privacy…Fifth, any system can be abused and, if the risk of serious abuse is significant enough, then we might eschew the approach…Finally, universal buy-in and implementation is not necessary to achieve a modicum of success.
My take: This is the part of the whitepaper where folks will freak out. Microsoft may have built a lot of security goodwill, but this stream of identity thoughts from the software giant is guaranteed to raise concerns. Why? The concept is coming from Microsoft, and no amount of disclaimers will prevent conspiracy theories.
Microsoft says:
What benefits arise from the fact that people, devices, software and data are more robustly authenticated and their activities audited? In a general sense, the most obvious benefit of authentication is that it empowers better trust decisions. Auditing creates a better ability to hold people accountable for misconduct, and thereby deter such conduct, assuming that domestic cybercrime laws and international cooperation mechanisms are sufficient. Enabling better trust decisions and accountability will solve specific real-world problems. For example, a well-audited transaction between two authenticated parties serves to protect both sides of the transaction. A bank could more easily authenticate a customer’s identity, a customer would have greater assurance that the Web site that he or she was visiting was that of the bank, and both parties could determine what truly happened if any issue arose. By conducting device-to-device authentication, organizations could reduce the number of external hackers with access to their systems, in large part because a hacker would need access to an “authorized” machine to connect to the victim’s network. In addition, if an unauthorized access were to occur and better auditing records proved what happened, it would become much easier to apply physical-world mechanisms (e.g., law enforcement, political forces) to address cybercrime, economic espionage and information warfare. Because these mechanisms enable more effective trust decisions to be made throughout the ecosystem — by and about people, devices, software and data — we call this End-to-End Trust.
My take: Sounds like a security standards spat is on deck.
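The device-to-device authentication the excerpt describes is commonly realized as a challenge-response exchange. This is a minimal stdlib sketch, not anything from the whitepaper: the pre-shared key standing in for hardware-backed (e.g., TPM) credentials, and all names, are invented for illustration.

```python
import hashlib
import hmac
import os

# Hypothetical device credential. In the trusted-stack vision this secret
# would be rooted in hardware (a TPM), not sitting in application memory.
DEVICE_KEY = os.urandom(32)

def make_challenge() -> bytes:
    """Verifier sends a fresh random nonce to the connecting device."""
    return os.urandom(16)

def respond(challenge: bytes, key: bytes) -> bytes:
    """Connecting device proves possession of the key without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, key: bytes) -> bool:
    """Verifier recomputes the expected response; constant-time compare."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = make_challenge()
response = respond(challenge, DEVICE_KEY)
authorized = verify(challenge, response, DEVICE_KEY)       # True: known device
intruder = verify(challenge, respond(challenge, os.urandom(32)), DEVICE_KEY)  # False
```

The point of the fresh nonce is that a captured response is useless for a later connection attempt — which is why such a scheme raises the bar for the external hacker in Microsoft's example, who would need the "authorized" machine itself, not just a recording of its traffic.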
Microsoft outlines the components of a trusted stack:
Because all software operates in an environment defined by hardware, it is critical to root trust in hardware. Today, many computers come with a Trusted Platform Module (TPM), a technology that will expand and enter new form factors…The operating system must be verifiable based upon keys stored in the hardware (e.g., “trusted boot”). This allows the device to claim that the operating system has not been tampered with to bad effect…Computers were, of course, designed to run code, without concern about its authorship or the intent of that author. Today there are multiple ways to help protect people from software vulnerabilities and malicious code. To protect users from vulnerabilities, code can be rewritten in safer languages, checked with analytic tools, compiled with compilers that reduce vulnerabilities (e.g., buffer overruns) and sandboxed when executed…A safer Internet needs to support the option of identities based directly or derivatively upon in-person proofing, thus enabling the issuance of credentials that do not depend upon the possession of a shared secret by the person whose identity or identity attribute is being verified. To some extent, government activities and markets themselves are driving in-person-proofing regimes…Applications should incorporate seamless mechanisms for applying signatures to their outputs, and read signatures before opening documents, so that data origin and data integrity can be easily checked….An audit trail is a record of a sequence of events from which a history may be reconstructed. An audit log is a set of data collected over a period of time for a specific component. A series of audit logs can be studied to determine a pattern of system usage that, over time, can be used to highlight aberrant behavior such as criminal activity or the existence of malware. Audit data is also necessary to roll back suspicious or harmful transactions.
My take: Trusted hardware and operating systems are a no-brainer and probably doable. Identities and the audit trail are much trickier. How would this stack work in practice? And is there a performance hit?
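On the audit-trail question: the whitepaper's claim that a series of audit logs can highlight aberrant behavior is easy to sketch. The toy log entries, field layout, event names, and threshold below are all invented for illustration; real audit records would carry far more detail.

```python
from collections import Counter

# Hypothetical audit log entries: (timestamp, user, event).
audit_log = [
    ("09:01", "alice",   "login"),
    ("09:02", "bob",     "login"),
    ("09:03", "mallory", "login_failed"),
    ("09:04", "mallory", "login_failed"),
    ("09:05", "mallory", "login_failed"),
    ("09:06", "alice",   "file_read"),
]

def flag_aberrant(log, event="login_failed", threshold=3):
    """Return users whose count of a given event reaches the threshold --
    the sort of pattern a series of audit logs is meant to surface."""
    counts = Counter(user for _ts, user, ev in log if ev == event)
    return sorted(user for user, n in counts.items() if n >= threshold)

suspects = flag_aberrant(audit_log)  # ['mallory']
```

Even this trivial aggregation hints at the real-world difficulty: the value comes only once logs from many components are collected, correlated, and retained — which is exactly where the performance and practicality questions bite.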
Microsoft Calls For Broad Dialogue On Internet Trust
http://www.informationweek.com/news/security/app_security/showArticle.jhtml?articleID=207100148
Chief research and strategy officer Craig Mundie's proposal includes the creation of a trusted computing stack in which software, hardware, people, and data can be authenticated.
By Thomas Claburn
InformationWeek
April 8, 2008 12:45 PM
At the 2008 RSA Conference in San Francisco on Tuesday, Microsoft chief research and strategy officer Craig Mundie called for a wide-ranging discussion about creating a more trustworthy Internet.
As a first step, Microsoft (NSDQ: MSFT) published a call-to-action for the technology industry that proposes the necessary elements for establishing a more secure and trustworthy environment online. The white paper detailing Microsoft's plan, "Establishing End to End Trust," was written by Scott Charney, the company's corporate VP of trustworthy computing.
Microsoft has also established an online forum where those concerned about security and privacy on the Internet can participate in the discussion.
The vision articulated by Microsoft encompasses the creation of a trusted computing stack in which software, hardware, people, and data can be authenticated. It imagines "a system that enables people to preserve their identity claims while addressing issues of authentication, authorization, access, and audit." And it seeks closer alignment of Internet stakeholders as a means to make progress, an aspiration that implicitly acknowledges the daunting task of rebuilding trust online.
Microsoft is aware of the difficulties of rewriting the rules of the Internet, but it contends something has to be done. "[S]taying the current course will not be sufficient; the real issue is that the current strategy does not address effectively the most important issue: a globally connected, anonymous, untraceable Internet with rich targets is a magnet for criminal activity -- criminal activity that is undeterred due to a lack of accountability," Charney explains in his white paper. "Moreover, the Internet also fails to provide the information necessary to permit lawful computer users to know whether the people they are dealing with, the programs they are running, the devices they are connecting to, or the packets they are accepting are to be trusted."
"We believe that End to End Trust will transform how the industry thinks about and approaches online trust and security," said Mundie in prepared remarks. "Our end goal is a more secure and trustworthy Internet, but it's also important that we give people the tools to empower them to make good trust choices. End to End Trust will enable new opportunities for collaboration on solutions to social, political, economic, and technical issues that will have a long-term impact on Internet security and privacy."
Perhaps wary of the blowback that followed its 2001 introduction of its "Hailstorm" identity database service (which withered a year later because other companies didn't want Microsoft authenticating their customers), Microsoft is providing more detail about what its proposal is not than what it is.
Charney makes it clear that Microsoft is not calling for an end to anonymity, a new national identification scheme, or a mega-database of personal information.
At the same time, Charney acknowledges that Microsoft's vision will have some impact on privacy, that abuse of a more authenticated environment may still happen, and that universal buy-in isn't necessary to make the Internet more trustworthy.
Kurt Roemer, chief security strategist for Citrix Systems (NSDQ: CTXS), in a statement acknowledged that being able to assess trustworthiness online remains a key concern for organizations and consumers. "It's time for a global collaborative effort to define and support an actionable end-to-end trust model that can help balance the often competing interests of privacy and security," he said.
The question is whether a Microsoft-driven initiative can thrive despite the competing interests of competitors, or whether any such effort, however seemingly well-intentioned, is doomed by technological partisanship and conflicting agendas.
But in taking such a hat-in-hand approach, in asking for consensus-building rather than trying to impose a branded technical solution, Microsoft manages to make such a question seem petty, like arguing over whether red or blue buckets should be used to bail water out of the sinking ship that is the Internet.
Charney doesn't quite put it that way. He asks, "As we become increasingly dependent on the Internet for all our daily activities, can we maintain a globally connected, anonymous, untraceable Internet and be dependent on devices that run arbitrary code of unknown provenance?"
Answering his own question as if there were still some question about the answer, Charney continues, "If the answer to that is 'no,' then we need to create a more authenticated and audited Internet environment, one in which people have the information they need to make good trust choices."
In other words, we need to create a more authenticated and audited Internet environment.
In a phone interview prior to Mundie's address, Steve Lipner, senior director of security engineering strategy of Microsoft's Trustworthy Computing group, discussed Mundie's planned remarks and how much the security of Microsoft's products had improved in the six years since its Trustworthy Computing initiative began. The security of Microsoft's products isn't perfect, he said, because that isn't possible. But they are now on a path of continuous improvement.
Although the vulnerability of Microsoft's software has declined, Lipner said, the shift toward sophisticated targeted attacks and social engineering shows that there's more to be done. "While there's some comfort the products are getting secure, there's still concern that customers aren't safe on the Net," he said.
As an example of how the Internet might work if other major stakeholders buy into Microsoft's vision, Lipner pointed to Web sites for children. "If you have children-only Web sites, how do you know that the children-only Web site is in fact for children only?" he said. "With stronger authentication and a trusted stack, we get to the idea of in-person proofing."
The idea, a safer Internet, certainly sounds appealing. But the devil is in the details. In all likelihood, Microsoft will be providing updates on its End to End Trust proposal at the 2009 RSA Conference, and in the years that follow, for quite some time. "This is a launch of a long term initiative that we think will bear fruit over time, but is very important in improving people's trust in the Internet," said Lipner.
Encryption Solutions Get Boost from Data Breaches
http://business.newsfactor.com/story.xhtml?story_id=013001BYG3YM&page=1
Trusted Computing Group:
http://news.ecoustics.com/bbs/messages/10381/470816.html
"Full drive encryption enabled in hardware and based on the open standards created by the Trusted Computing Group with Seagate's leadership can give administrators and users confidence that data will be encrypted quickly, easily and always," noted Brian Berger, Trusted Computing Group marketing work group chair. "As demonstrated by the rapidly increasing number of lost, stolen or hacked drives, encryption in hardware really is the most effective solution to help ensure the security of at-rest mission-critical information. Otherwise, corporations potentially face millions or worse in government fines, lost business, lost goodwill and undermining of other corporate relationships."
Check out this link:
http://fdesecurityleaders.com/
Self-Encrypting Disk Drives to Move into Data Centers
http://www.podtech.net/classic/5063/self-encrypting-disk-drives-to-move-into-data-centers
Seagate, IBM, and LSI have been working to make changes in how data is protected in enterprise Data Centers. Today Seagate announced it is making available the world's first self-encrypting hard drive for data centers. PodTech spoke with key executives at Seagate, IBM, and LSI about this new technology shift, how it works and why enterprise data center managers should know more about the next phase in enterprise security.
In this first of three podcasts, PodTech's Michael Johnson speaks with Sherman Black, senior vice president and general manager of the enterprise compute business at Seagate. Black explains how Full Disk Encryption (FDE) is crucial for enterprise data storage security, as all drives eventually leave the data center, whether for repair, retirement, or maintenance.