jakes_dad

09/17/10 8:26 PM

#199220 RE: internet #199219

Nice follow-up, Internet

Good to see SKS set the record straight.....


3 responses so far:
1 steven sprague September 17, 2010 at 2:53 pm

John,
I enjoyed your panel, but it is unfortunate that your schedule didn’t allow you to attend the whole trusted computing conference to see more of the technology demonstrations and commercial solutions available, and to talk with end users who have put trusted computing to work in their organizations. The content was new, very rich, and quite compelling.

While it is true that TPMs have been out for many years, it takes time for a hardware technology to reach the tipping point of an installed base. I think we witnessed that tipping point this week. For many it was an introduction to “What the Heck is Trusted Computing Anyway?” With 350 million-plus TPMs in distribution and tens of millions of new devices shipping each year, this is a strong and growing deployment of the technology.

There were a couple of primary messages that came out of the conference, and I think they are important for you, and for Gartner’s clients, to consider.

We need to move to hardware security, and trusted computing is providing the basic building block to do that.

Step 1. Only known devices should be connected to sensitive networks and sensitive data. In this context, “known” means known by the owner of the network or data. For example, downloading the entire medical database onto the kid’s laptop is probably a bad idea unless the kid’s laptop has been registered with the network and has an approved data protection solution. The TPM was clearly articulated as the solution for this type of access control, as it is already deployed, inexpensive, and ready to be used. A case study presented by PricewaterhouseCoopers shared how they have deployed over 30K users and plan to roll it out to over 150K. When the presenter was asked what the users thought about the change, the reply was, “they have not noticed any difference.” Greater security with no added complexity is a good thing. Hardware-based authentication was a central message across the three-day conference.
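A minimal sketch of the registration-then-challenge pattern behind this “known device” idea, in Python. It is illustrative only: the device IDs and the HMAC shared secret are hypothetical stand-ins, since a real deployment would use a key pair whose private half lives inside the TPM rather than any secret that host software can copy.

import os
import hashlib
import hmac

# Enrollment registry: device_id -> secret provisioned at registration time.
# (A real deployment would store a TPM-bound public key, not a shared secret.)
REGISTERED_DEVICES = {"laptop-042": b"provisioned-at-enrollment"}

def fresh_nonce() -> bytes:
    # A fresh random challenge so a captured response cannot be replayed.
    return os.urandom(16)

def device_response(secret: bytes, nonce: bytes) -> bytes:
    # What an enrolled device computes (in reality, inside its TPM).
    return hmac.new(secret, nonce, hashlib.sha256).digest()

def is_known_device(device_id: str, nonce: bytes, response: bytes) -> bool:
    secret = REGISTERED_DEVICES.get(device_id)
    if secret is None:
        return False  # the unregistered kid's laptop is refused
    expected = hmac.new(secret, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = fresh_nonce()
resp = device_response(REGISTERED_DEVICES["laptop-042"], nonce)
assert is_known_device("laptop-042", nonce, resp)       # registered: admitted
assert not is_known_device("kids-laptop", nonce, resp)  # unknown: refused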

Step 2. Use hardware-based data protection. This was embodied in self-encrypting drives built to the Trusted Computing Group Opal specification. Seagate announced its FIPS 140-2 certification, and protection of data at rest was broadly discussed. There were a number of demonstrations of this technology, and again the desire to move data protection onto a hardware basis was clearly stated.
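As a rough mental model of what an Opal self-encrypting drive enforces, here is a toy sketch of locking ranges (“bands”). Everything here is hypothetical: a real Opal drive enforces this inside the drive controller with per-range media encryption keys, not in host software.

from dataclasses import dataclass, field

@dataclass
class LockingRange:
    start_lba: int
    length: int
    locked: bool = True  # ranges come up locked at power-on

@dataclass
class OpalDriveModel:
    ranges: list = field(default_factory=list)
    _admin_pin: bytes = b"set-at-initial-setup"

    def unlock(self, index: int, pin: bytes) -> bool:
        if pin != self._admin_pin:
            return False  # wrong credential: the band stays ciphertext
        self.ranges[index].locked = False
        return True

    def read(self, lba: int) -> str:
        for r in self.ranges:
            if r.start_lba <= lba < r.start_lba + r.length:
                return "unreadable" if r.locked else "plaintext"
        return "unmapped"

drive = OpalDriveModel(ranges=[LockingRange(0, 1000), LockingRange(1000, 1000)])
print(drive.read(500))                    # unreadable before unlocking
drive.unlock(0, b"set-at-initial-setup")
print(drive.read(500))                    # plaintext; the second band stays locked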

Step 3. Trusted Network Connect: leveraging the attestation capabilities of the TPM to assure that the device being connected is configured as expected for the services that are required. Using hardware-based attestation is critical to preventing known attacks. This was clearly demonstrated in a video shown on the first day, which was promised to be made available on the Web; every enterprise needs to see it once it is published. It showed how hardware-based attestation of the state of a device, by verifying that the software has not been altered, could prevent a number of significant attack vectors.
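The core of the attestation check is simple to state, even though the real protocol (signed TPM quotes over PCR values) is not shown here. A hedged sketch, assuming a single “golden” measurement of an approved boot stack; the component names are made up.

import hashlib

def measure(components):
    # Stand-in for PCR extension: hash the ordered boot components.
    return hashlib.sha256("|".join(components).encode()).hexdigest()

GOLDEN_MEASUREMENT = measure(["bootloader", "kernel", "approved-av"])

def admit_to_network(reported_measurement):
    # A real verifier would also check the quote's signature and nonce.
    return reported_measurement == GOLDEN_MEASUREMENT

print(admit_to_network(measure(["bootloader", "kernel", "approved-av"])))  # True
print(admit_to_network(measure(["bootloader", "kernel", "rootkit"])))      # False: altered software refused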

Step 4. Hardware-assured virtualization, or the High Assurance Platform: the combination of all of the above to provide the enterprise with a mixed-use device, where the user can SAFELY do what they want and not alter the stuff they don’t want screwed up.

I agree with you, John, that we want open, programmable platforms that we can download code onto and do whatever the heck we want with. BUT we also want to access our checking account with software that cannot be compromised or generate any personal RISK. This is the basis of trusted computing: how do I create truly secure sandboxes that will not allow bad stuff to take over the whole computer, yet still give the user the flexibility they need? The industry has responded very well to the concerns about privacy and control, and has built a set of tools that can be implemented very broadly, for very simple uses and for highly complex applications.

The only place where I take exception to your comments is the suggestion that trusted computing somehow restricts users’ choices. Today the lack of security is hugely restricting our use of technology. How many households have purchased a separate PC just for their small business or banking because they don’t want it to be compromised? How many businesses restrict employees’ use of technology, specifically social media, because of the risk of infection and malware? How many children are put at risk every day because parents don’t have the tools, and how many jobs are lost because of stolen IP, product plans, marketing events, and financial data that are actively being compromised by other nations and rogue companies?

As was clearly stated at the conference, it is time for a new approach: one that is based on hardware, one that is based on standards, and one that maximizes the capability of the technology we have.

Steven Sprague
2 John Pescatore September 17, 2010 at 3:48 pm

Actually, I was at the conference on Wednesday and did attend several sessions and spend time at the vendor exhibits.

I really don’t disagree with any of your points – except the part about the Trusted Computing approach restricting user choice *less*. There is no doubt that hardware attestation, allowing only trusted software to run, etc., increases security. However, almost all of that could have been achieved previously by not allowing users to have any admin rights and by locking down OS configurations. But most government agencies (including most of the military) in the unclassified world found that approach *too* restrictive. It impacted productivity such that the damage to the mission was equivalent to the impact of many threats.

As several of the attendees asked: “Why are there 350M TPMs out there and less than 1% are actually used?” The commercial vendors said “lack of demand, driven by complexity and fear of compromise of privacy”. Those are issues that need to be dealt with – especially as we see consumerization changing the IT problem once again.

Just as the MLS world of the mid-1980s was aiming at departmental computing while personal computing was dislodging it, the TPM-centric approach is aiming at locked-down hardware just as enterprise computing is moving away from that. I brought that up on the panel, and there was discussion of the difficulties of trying to extend out to consumer devices – more about complexity, cost, and fear of compromise of privacy.

There are definitely advantages for everyone in standards, and there are definite security advantages to hardware-based security in high-assurance, high-control environments. However, translating that over to the commercial market requires addressing the pressures of mainstream enterprises as well.
3 steven sprague September 17, 2010 at 5:20 pm

It will be interesting to see how these forces play out.

The TPM has complexity in order to protect privacy.
Mobile devices generally don’t.

As enterprises broadly adopt non-PC devices, will privacy become a non-issue, or will controls similar to the TPM’s need to be added? Really, only time will tell. The TPM may get enough adoption that the expectation of isolation of identities will then carry over to mobile. We have already seen these privacy forces at work in this direction in the OIX rules for OpenID.

On consumer choice, what we really want is a really good sandbox. On that I think we agree. It would allow one PC to serve both consumer and corporate use with two admins; whether it is mom’s PC or the kids’ PC, the problems are similar. Trusted Computing technologies, from simple banding in Opal drives to the High Assurance Platform, are the current technologies to make that happen with some level of trust. The alternative is to carry 3-4 devices; time will tell.

Still, the first step is to move all existing software certificates to hardware. It is easy, and it does not change the scope of enterprise management or the user experience, but it does change the security of the key.
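One way to see why moving a certificate’s key into hardware changes its security without changing its use: the signing interface stays the same, but the private material stops being copyable bytes. A hypothetical sketch; these classes model the idea and are not any real TPM API.

class SoftwareKey:
    # Private key lives on disk as bytes that any malware can copy.
    def __init__(self):
        self.key_bytes = b"...private key material on disk..."

    def sign(self, msg: bytes) -> bytes:
        return b"sig(" + msg + b")"

class HardwareKey:
    # Stand-in for a TPM-resident key: callers hold a handle, never the bytes.
    def __init__(self):
        self._handle = 0x81000001  # persistent-handle-style identifier

    def sign(self, msg: bytes) -> bytes:
        # In a real TPM the signature is computed inside the chip.
        return b"sig(" + msg + b")"

sw, hw = SoftwareKey(), HardwareKey()
stolen = sw.key_bytes       # exfiltrating the software key is one line
print(hw.sign(b"request"))  # same user experience; no key bytes to steal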

wavxmaster

09/17/10 8:28 PM

#199221 RE: internet #199219


Internet

Thanks. I think many should read this::

What the Heck is Trusted Computing Anyway?
by John Pescatore | September 17, 2010 | 3 Comments
Yesterday I chaired a (mostly) private industry panel on the closing day at NSA’s Trusted Computing conference. The topic was “Key Drivers and Trends of Trusted Computing.” The panel consisted of:

•Ernie Brickell – Intel
•Paul England – Microsoft
•Ron Perez – AMD
•Mark Schiller – HP
•Bill Burr – NIST
The audience was primarily folks from government agencies and defense contractors who have been working in Multilevel Security, High Assurance Processing and Cross Domain security for many years. (For an interesting and humorous perspective on the conference from a savvy security industry reporter, see Ellen Messmer’s Network World piece here.) We had a panel discussion for the first 20 minutes or so, and then I opened it up to audience discussion – which went on non-stop for the next hour.

A lot of the audience questions were basically poking the private industry panelists about why they haven’t just built trusted computing into their products. It was quickly obvious (as Ron Perez noted) that one major issue was the definition of trusted computing.

The Trusted Computing Group definition is basically: “With Trusted Computing, the computer or system will consistently behave in specific ways, and those behaviors will be enforced by hardware and software when the owner of those systems enables these technologies.” This is typically interpreted by most government security folks as “With Trusted Computing we will finally be able to force the user to only run the software we want them to run on the hardware we want them to use – that will put an end to malware.”

However, in the commercial environment trends like consumerization are pushing in the opposite direction – more and more frequently business will be done from hardware running applications where both are chosen by the user. Trusted Computing that focuses on user lockdown is aiming way, way behind the duck.

That’s why I’ve been calling this area “Trustable Computing” since 2003. Back then I wrote a Gartner Research Note, “Three Scenarios for Trustable Computing Platforms,” when the initial TCG concepts and industry approaches (like the TPM chip) were being defined. The most likely scenario I defined is exactly what has happened:

V-Chip Scenario — PC hardware is updated to include security chips, and the Windows kernel is modified to include NGSCB. However, consumers and the overall market ignore the trusted execution features, similar to how they ignore V-chip technology (see “P3P Will Be the V-Chip of the Internet”). Most end users (and IS managers) will have used the Internet for more than 10 years by 2008, when trustable computing platforms will emerge. These users resist more-stringent controls on content and use their market power to reward content providers that provide more-open access. The net result is limited use of the technology by enterprises, as well as little consumer adoption.

As many in the NSA conference audience kept pointing out: “There are 350 million PCs with TPM chips in them, why doesn’t private industry force them to be used?” The reason is that all the benefits are focused on restricting the user, not on letting users stay safe doing what they need to do to get the job done.

The key to this is the definition of “owner” in the TCG definition above. “Owner” should really be “user” – it doesn’t matter who bought the platform, the performance should be predictable and safe for the user.


player1234

09/17/10 9:08 PM

#199223 RE: internet #199219

What agenda and prejudice do you refer to regarding Gartner?


I thought the key line in the original post was "The reason is all the benefits are focused on restricting the user, not letting users stay safe doing what they need to do to get the job done." However, I'm not sure if that is where your focus is.

samk

09/18/10 9:38 AM

#199248 RE: internet #199219

internet, Steven’s full response from the blog you posted is quoted above; a fourth comment followed it:

4 ordaj September 17, 2010 at 8:38 pm

John,

“…only allowing trusted software to run…”

I don’t think Trusted Computing is trying to achieve this. If a high-assurance entity (your bank, the military, your employer) wants to be sure you’re accessing their assets with a trusted and uncompromised piece of software, I think that is a reasonable thing. But if you also want to run other programs, I don’t believe it restricts you. If you want to access Facebook, unless Facebook requires you to have a trusted connection, you can access it without restriction.
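ordaj’s point amounts to a per-service policy: only services that demand a trusted connection check attestation, and everything else is untouched. A small hedged sketch; the service names and policy table are hypothetical.

# Per-service policy: only high-assurance services require attestation.
REQUIRES_ATTESTATION = {"bank.example": True, "facebook.com": False}

def may_connect(service: str, platform_attested: bool) -> bool:
    if REQUIRES_ATTESTATION.get(service, False):
        return platform_attested  # high-assurance service: prove platform state
    return True                   # ordinary service: no restriction at all

print(may_connect("facebook.com", platform_attested=False))  # True
print(may_connect("bank.example", platform_attested=False))  # False
print(may_connect("bank.example", platform_attested=True))   # True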