Blog archive - August 2012
Use the blog to discuss and comment on the latest industry insights provided by our analyst experts.
by Jeanine Sterling 30 Aug 2012
In an earlier blog post, we shared survey data showing that a full 82% of North American businesses already have at least one mobile application deployed to their employees’ handheld devices. Looking forward over the next twelve months (mid-2012 to mid-2013), 68% of businesses plan to implement one or more additional applications, and 9% of total respondents actually expect to introduce more than ten new solutions during that twelve-month period. This is obviously a growing market. But just which applications are getting the attention? And which are not?

The research cited was conducted earlier this year by Frost & Sullivan, which surveyed 300 mobile and wireless decision-makers in the North American business sector. Both Canadian and U.S. companies and organizations were included in this research. The percentages of respondents reporting that they have already deployed a particular employee-facing mobile software solution (whether as a trial or a full-blown implementation) are as follows:

80% - Wireless email
48% - Access to internal databases
44% - Standalone corporate instant messaging
38% - Mobile sales force automation
37% - Employee-to-employee social media
35% - Mobile workforce management
33% - Machine-to-machine remote monitoring and diagnostics
29% - Mobile asset tracking
27% - Standalone video capture
13% - Fleet tracking and management

Typically, around half of the companies that have already deployed a given solution also anticipate expanding their implementations within the following year. There is also a segment of respondents who are sold on an app’s value proposition but have not yet gotten around to implementing it. These “planners” report that they will introduce the app for the first time within the next three years.
Adding these planner opportunities to the current users who state they will be expanding within the next year, the new-deployment opportunity during 2012-2015 can be ranked as follows:

1. Mobile sales force automation
2. Access to internal databases
3. Wireless email
4. Mobile workforce management
5. Mobile asset tracking
6. Standalone corporate instant messaging
7. Machine-to-machine remote monitoring and diagnostics
8. Employee-to-employee social media
9. Standalone video capture
10. Fleet tracking and management

Additional survey results are included in our upcoming study: “2012 Mobile Enterprise Applications: Opportunities Within Enterprises in North America,” NB68-65.
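For readers who want to play with the numbers, the deployment figures quoted above can be loaded and re-ranked in a few lines. This is just an illustrative sketch using the survey percentages from this post:

```python
# Deployment rates for employee-facing mobile apps
# (percentages from the Frost & Sullivan 2012 survey cited above)
deployed = {
    "Wireless email": 80,
    "Access to internal databases": 48,
    "Standalone corporate instant messaging": 44,
    "Mobile sales force automation": 38,
    "Employee-to-employee social media": 37,
    "Mobile workforce management": 35,
    "Machine-to-machine remote monitoring and diagnostics": 33,
    "Mobile asset tracking": 29,
    "Standalone video capture": 27,
    "Fleet tracking and management": 13,
}

# Rank applications from most to least deployed
for rank, (app, pct) in enumerate(
        sorted(deployed.items(), key=lambda kv: -kv[1]), start=1):
    print(f"{rank:2d}. {app}: {pct}%")
```

Note that this ranks current deployment only; the 2012-2015 opportunity ranking above also folds in planner and expansion intentions, which is why the two orderings differ.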
by Chris Rodriguez 29 Aug 2012
It is entirely possible that the Black Hat Conference was set in Las Vegas simply as an ironic statement about the state of information security today. Everyone is gambling. For hackers, the buy-in cost to attack a massive number of websites through simple Web scripts, spam, and malware is minimal. But with a slightly larger investment of time and resources, hackers can target large organizations for much bigger jackpots. Consequently, enterprise organizations are “all-in” whether they want to be or not. The stakes are high, as hackers conducting targeted attacks are not stopping at personally identifiable information (PII) and are now going for the “secret sauce” recipes. Security breaches that leak state secrets and intellectual property are difficult to measure in dollars but have major sociopolitical implications that can shape the modern world. Worse still, hackers and cyber criminals have stacked the deck and are able to penetrate network defenses through targeted attacks seemingly at will.

So, All is Lost?

For 2012, the prevailing tone underlying every major industry discussion has been defeatism. In the keynote address given by Shawn Henry, former FBI-guy extraordinaire, we were told that we are all already breached and should accept this fact. Now, I understand the psychology here: if we convince ourselves that there are malicious actors already in our systems, then we will be that much more motivated to go find them. However, a paranoid hunt for a phantom hacker cannot possibly be an efficient use of very limited resources. On the other hand, this defeatist mentality has led other security professionals nearly to the point of surrender. One recent post calls for companies to cease security awareness training for end users and to focus instead on security technologies. While it is important to prioritize the security strategies that are most effective, the urge to give up on the human element is short-sighted.
Training at all levels is an essential pillar in any security architecture, and I would argue that training requirements should actually increase for end users higher up the management ladder.

Who Should Take Responsibility

Henry also indicated that the government is not interested in (or capable of) protecting the private sector; it considers information security to be the responsibility of each individual business. The idea is that businesses must ward off thieves and burglars in the physical world and should do the same in the cyber world. In the very next briefing, Marcus Ranum voiced his concern and distaste for this notion, pointing out that the government should help defend private businesses as well, because the most sophisticated attacks are often perpetrated by nation states and foreign criminal organizations. These organizations could not conduct physical attacks against businesses without recourse, but are free to do so online. (Henry also said that we should take the fight to the enemy, which is clearly not viable or even advisable depending on the attacker. But at that point I was certain he was just jumping the shark for entertainment value.)

Unfortunately, many businesses do not realize that they have been breached until a business partner reports anomalous behavior from the victim’s systems. The FBI and other government organizations will only contact victims once the damage is done. The government is interested in finding and eliminating hacking groups, which is admirable but does little to protect businesses from their attacks in the meantime.

Where Do We Go Now?

The point is: the threat is greater than ever. Businesses must implement the foundational layers of security, including next-generation firewalls, IPS, content filtering, and anti-virus.
Leading security vendors such as Check Point and McAfee have been adding new features such as advanced anti-malware, anti-bot technologies, DLP, and threat correlation to better defend customers. Businesses should also invest in vulnerability management solutions from innovative vendors such as Qualys and BeyondTrust (which acquired eEye). SMB companies can get nearly all of these capabilities from UTM solutions, as UTM vendors including Fortinet and WatchGuard license technologies such as anti-virus and IPS from leading vendors such as Kaspersky, while also maintaining a high level of internal product and feature development. Thus, there is no excuse for businesses to lack these essential technologies.

Additionally, new technologies are emerging that will empower businesses to block a large number of threats. Companies such as ClickSecurity and Critical Watch use advanced analytics and security intelligence to identify the most sophisticated and targeted attacks. Considering the massive amounts of log data generated in enterprise organizations and attackers’ skill at remaining undetected, these real-time tools are absolutely necessary to find the most persistent security breaches.

Most importantly, companies must dedicate budget to improving the “human factor” through training, auditing, penetration testing, and analysis. As for penetration testing, everyone should be doing it on a regular basis; companies that don’t engage in auditing and penetration testing have no insight into their true security posture. Use these guidelines to find reputable and skilled ethical hacking companies. These days, companies such as Trustwave SpiderLabs make it easy to purchase, prioritize, and regularly schedule these engagements through retainer agreements.

The Last Word

The time has come: it is necessary to adapt. But this is not such a bad thing.
The best-known attacks, Operation Aurora, Stuxnet, and Flame, were based on techniques and vulnerabilities that have been known in the industry for years. We have known that this day would come, but we don’t need to feel like the sky is falling just because it is here. Instead, businesses must learn, understand, strategize, implement, test, and update their information security architectures. Businesses are betting the house on a bad hand when they rely on outdated security technologies, ignore the human element, and fail to adapt to new threats. But that’s just me: I never gamble with my money—only with my life.

*****

Double down with Industry Analyst Chris Rodriguez by e-mail. For additional information on hacking, check out the Frost & Sullivan white paper entitled The Importance of Ethical Hacking: Emerging Threats Emphasise the Need for Holistic Assessments, or learn more about Network Security.
by Vikrant Gandhi 26 Aug 2012
In a recently released Frost & Sullivan report, the efforts of independent M2M service providers and M2M MVNOs were highlighted as one of the key drivers in the North American M2M communications market. While it is true that leading mobile operators have committed a significant amount of resources (platforms, processes, and people) to M2M, they may find it difficult to do justice to each and every emerging opportunity. For example, small-to-midsized business (SMB) customers often struggle to get the full attention of large mobile operators, which is why M2M solution providers have been successful in providing remote connectivity services to the SMB segment. This is not to say that M2M solution providers work exclusively with smaller organizations; in many cases, they have started to emerge as the preferred connectivity providers within the large enterprise segment. One such M2M solution provider is Aeris Communications (Aeris). Based on discussions with Aeris, the following key success factors for leading M2M solution providers were identified:

Adopt a “Solutions Provider” Approach Instead of a “Services Provider” Approach

M2M customers no longer want to use the services of a provider that sells only airtime. M2M is a complex market; it is appropriate to describe it as an amalgam of multiple technologies. Customers need a partner that can help them with all stages of their deployments, whether it is the choice of hardware module, application design, or integration with existing infrastructure. Additionally, professional services and operational support tools should be available to deliver a compelling customer experience. All of these areas need a partnership approach.

Purpose-Built Network

Successful M2M solution providers have built a “dedicated” network for M2M.
Ownership of network platforms - such as the Home Location Register (HLR), Short Message Service Center (SMSC), and rating and charging/billing systems - can help M2M solution providers deliver a unique and customized service experience to customers. Additionally, change requests (or modifications) to service parameters - such as changes to a rate plan or even the desired quality of service (QoS) - may be easier for such solution providers to accommodate. Bolting M2M capabilities onto an existing consumer services framework may work fine in the short run; however, long-term success is clearly going to be a result of an M2M solution provider’s ability to help customers in a timely and effective manner, which comes from ownership of strategic network elements.

Flexibility and Ease of Implementation

M2M applications are unique and often vary widely in their requirements. The ability to accommodate new business models through customized pricing is absolutely critical in this industry. This does not mean that M2M solution providers should look to support hundreds of different pricing models; in fact, there are certain standard plans that can work well in most cases. However, the ability to create explicit plans that match different M2M usage patterns is an important element of any successful competitive strategy. In addition to basic usage rating, these plans need to recognize the unique supply chain needs of M2M customers. Customers require rating solutions that allow them to build their products, test them over the air, and distribute them without incurring recurring fees. This allows an M2M customer to defer costs until there is offsetting revenue.

Putting It All Together (Formula for Success)

The key to success in M2M is to: a) have as much access to (and control of) the network control channels as possible, b) understand the unique customer needs and the specific requirements of the M2M applications, and c) merge it all together to create innovative solutions.
It is also about execution: customers want to get their product to market as soon as they can. Any process that is part of M2M service delivery should be automated, lest scaling become an issue for the M2M solution provider. Leveraging best practices in application design and development is one way to achieve this objective.
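The lifecycle-aware rating idea described above (no recurring fees while a product is being built, tested, or sitting in the distribution channel, so that costs are deferred until there is offsetting revenue) can be sketched in a few lines. The plan names, states, and rates here are hypothetical, purely for illustration:

```python
# Hypothetical lifecycle-aware M2M rating sketch: devices in pre-revenue
# states (over-the-air test, in-channel distribution) incur no recurring
# fee; billing begins only once the device is activated in the field.

MONTHLY_FEE = 2.50   # hypothetical per-device subscription (USD)
PER_MB_RATE = 0.10   # hypothetical usage rate (USD per MB)

def monthly_charge(state: str, usage_mb: float) -> float:
    """Rate one device-month according to its lifecycle state."""
    if state in ("test", "distribution"):
        return 0.0  # defer costs until there is offsetting revenue
    if state == "active":
        return MONTHLY_FEE + usage_mb * PER_MB_RATE
    raise ValueError(f"unknown lifecycle state: {state}")

# A tiny fleet: one device under test, one in the channel, one deployed
fleet = [("test", 1.2), ("distribution", 0.0), ("active", 14.0)]
total = sum(monthly_charge(state, mb) for state, mb in fleet)
print(f"monthly bill: ${total:.2f}")  # only the active device is billed
```

A real rating engine would of course track many more states and plan variants; the point is simply that the rating logic, not the customer, absorbs the pre-revenue phases of the supply chain.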
by Jeanine Sterling 15 Aug 2012
Earlier this year, Frost & Sullivan surveyed mobile and wireless purchase decision-makers in the North American business sector. Both Canadian and U.S. companies and organizations were included in this research. Their responses indicated a continuing strong attraction to employee-facing mobile applications. The mobile enterprise applications market is clearly on a growth trajectory, with at least 68% of the 2012 survey respondents planning to introduce one or more new mobile software solutions on their employees’ mobile handhelds (smartphones, tablets, basic cell phones, and/or ruggedized devices) during the next 12 months. At the time of the survey (May 2012), the majority of businesses (52%) already had from one to four mobile applications in place for their mobile workers. Another 17% of respondents had between five and ten applications deployed. An additional 13% of businesses had over ten mobile solutions in place, with two-thirds of these heavy users deploying an impressive twenty or more mobile handheld-based applications. In all, 82% of North American businesses (across all size segments) report having at least one mobile application deployed to their employees’ handheld devices. When asked to project the volume of new mobile worker software apps to be introduced during the upcoming twelve months (mid-2012 through mid-2013), 68% of businesses planned at least one new introduction, and another 9% of total respondents expect to introduce more than ten new solutions over that twelve-month period. This level of mobile software deployment for workers is great news for application developers, mobility platform vendors, and their various channel partners. But it also raises a number of IT and line-of-business (LOB) issues, including how to effectively manage the expanding array of solutions. Businesses must also carefully plan how to train employees on these apps.
Given the level of cost sensitivity that surfaced with other survey questions, executives must also determine how to measure the business value and benefits of the mobility software. Additional survey results and charts are included in our upcoming study: “2012 Mobile Enterprise Applications: Opportunities Within Enterprises in North America,” NB68-65.
by Nancy Fabozzi 13 Aug 2012
Health IT Needed to Transform Patient Care

The provision of health services is an exceptionally data-intensive endeavor. Of course, other industries are also very data-intensive, but the health care industry has notably lagged in making use of advanced information technology (IT) solutions to manage and process vast troves of data. Health care data includes clinical medical information collected at the point of care, financial information resulting from highly complex billing and claims processing, and voluminous administrative and demographic data required as a result of significant legal and compliance requirements--all of which provides a rich resource for gaining knowledge and insight into best practices. Unfortunately, in health care, disjointed data is collected across highly fragmented systems that are still often predominantly paper-based or, if electronic, are not usually amenable to interoperability with the electronic systems used by various payers, providers, or government agencies. The whole thing is a mess and everyone knows it, but the good news is that this situation is rapidly changing today with the new emphasis on all things health IT, particularly advanced EHRs and new methods of health information exchange. Over the past decade, and especially during the last five years, health care providers, including hospitals, have considerably accelerated their use of clinical IT systems such as EHRs. Thanks in no small part to the strong push from the federal government in the form of financial incentives provided by the HITECH Act, hospitals are focused on installing new EHR systems and digitizing clinical information that has traditionally been locked in paper-based silos.
The growing adoption of health IT is key to the government's goals to transform our health care system, as laid out in the ONC's Federal Health IT Strategic Plan for 2011-2015 and summarized in the following chart.

Growing EHR Adoption Drives Analytics Evolution to Next Phase

The growing use of EHRs is a very positive development on the road to health system transformation. In the next phase, newly digitized clinical data will be combined with financial and administrative data to yield new insights that will improve the quality, efficiency, and safety of patient care. Over the past 20 years or so, hospitals have steadily increased their knowledge of, and capabilities around, digitally gathering and analyzing financial and administrative information. Thus, the digitization of financial and administrative data is further along in the hospital setting than that of clinical data. Furthermore, business analytics, or business intelligence, has been in place in most hospitals to some degree. Unfortunately, most hospitals have yet to adopt sophisticated analytic approaches to the data generated by their new EHR systems, and in particular they have not yet integrated clinical data with financial and administrative data. Installing EHRs so that clinical data can be digitized and shared is the first step toward transformed health care. As EHR adoption grows, hospitals will need to move aggressively toward new processes and strategies to leverage clinical, financial, and administrative health data for the benefit of individual patients, patient populations, and the nation as a whole. Health care reform, the move to accountable care, and the prospect of bundled payments, or value-based reimbursement, are key drivers of the need to derive advanced insights from all forms of digital health data.
Gaining value and insight from health data requires advanced, cloud-based data analytics solutions that pull in data from all sources to provide both real-time and predictive insights, unlike the traditional retrospective business intelligence approach of the past, which mostly focused on financial analysis. Health data analytics is really a whole new approach to the analytics process, one that will impact every aspect of hospital operations. It is clear that the future of health care will be increasingly driven by advanced health data analytics utilized at every point of care. We believe that the urgent need to transform our health care system will require hospitals to increasingly invest in advanced data analytics solutions to monitor end-to-end care delivery across a variety of settings, as well as to provide comprehensive reporting on performance and quality measures to a variety of stakeholders. Understanding the key imperatives driving this phenomenon is essential. We take an in-depth look at this issue in our new report, U.S. Hospital Health Data Analytics Market, 2011-2016: Growing EHR Adoption Fuels A New Era in Analytics, part of Frost & Sullivan’s Healthcare & Life Sciences IT Growth Partnership Service program.
by Vikrant Gandhi 06 Aug 2012
Introduction

The importance of cloud-based M2M platforms cannot be overstated. The "Internet of Things" (IoT) will comprise billions of connected devices across numerous verticals. Cellular wide-area networks (WANs) will play an important role in providing connectivity, either directly or indirectly, to a good portion of the devices that make up the IoT. The IoT itself will be designed as a network of networks: various short- and long-range technologies will have to co-exist in order to facilitate data communication between connected devices and the enterprise backend. The current IoT environment is in a state of near-chaotic change, with new hardware, interfaces, network access technologies, application protocols, and other individual components added or deleted quite regularly. This creates massive complexity for M2M solution providers, who have to take all this fragmentation into account when designing, implementing, and upgrading their solutions. Adding large numbers of connected devices will also generate significant amounts of data ("Big Data") that need to be stored, analyzed, reported on, and archived. The current state of the industry is comparable to the early days of the smartphone revolution, when hundreds of thousands of application developers were testing new ideas to develop the next big application for the connected world. However, many of these efforts run the risk of never achieving commercial fruition, largely because the early innovators do not have the required expertise, resources, or time to address the industry fragmentation. Unlike smartphones and other connected display devices, which have standardized around a few main operating platforms, the M2M space continues to grow rapidly, with each vertical having its own set of platforms and pre-defined (and sometimes rigid) ways of implementing M2M solutions.
However, all M2M deployments send data, either proactively or in response to a specific command, to an application over a public or a private network. Exceptional growth in M2M will lead to an exponential increase in the amount of computing power required to effectively manage all the transactional data coming in from connected devices. These are some of the reasons why M2M cloud providers - which can provide the necessary resources to help application developers get started with M2M application development and deployment with relative ease - are expected to see significant adoption of their services. A well-designed and appropriately priced M2M cloud platform can certainly provide a cost-effective way to introduce new M2M applications to the market. In the past, M2M application providers had little choice but to develop and manage the entire communication system on their own; this was a very costly affair, and clearly not a very scalable approach. Cross-vertical communication between M2M applications (such as between an electric vehicle and the Smart Grid) is important to help derive maximum benefit from the connected devices revolution. Security and compliance requirements are also evolving rapidly, and this places an additional burden on M2M solution providers, who have to ensure that their deployments are in line with the key expectations of their customers and the regulatory authorities. In many cases, ongoing audit and certification requirements can place an additional economic burden on M2M deployments, especially those that are custom-built to serve the needs of one particular customer, industry, or vertical. The application layer and the infrastructure (or platform) layer are the two main high-level components of M2M cloud platform deployments. M2M application developers should be able to focus on application development without having to spend too many resources on infrastructure. That is where cloud-based M2M platforms come in.
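The two data-flow patterns noted above (a device reporting proactively, or replying to a specific command from the backend) can be sketched minimally as follows. The message fields and device names here are illustrative assumptions, not any particular platform's schema:

```python
# Minimal sketch of the two M2M data-flow patterns: device-initiated
# reports and command/response exchanges. JSON-over-network is assumed
# purely for illustration; real deployments use many wire formats.
import json
import time

def proactive_report(device_id: str, readings: dict) -> str:
    """Device-initiated telemetry message sent to the backend application."""
    return json.dumps({
        "type": "report",
        "device": device_id,
        "ts": int(time.time()),
        "data": readings,
    })

def handle_command(device_id: str, command: str, state: dict) -> str:
    """Device's reply to a specific command issued by the backend."""
    if command == "get_status":
        return json.dumps({"type": "response", "device": device_id, "data": state})
    return json.dumps({"type": "error", "device": device_id, "reason": "unsupported"})

print(proactive_report("meter-042", {"kwh": 512.7}))
print(handle_command("meter-042", "get_status", {"online": True}))
```

Even this toy example hints at the scaling problem described above: every fielded device emits such messages continuously, and the backend must parse, store, and act on all of them.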
In many cases, M2M cloud platforms are designed to support hardware deployments of a particular module vendor (which may have a client agent pre-installed on M2M endpoints). In the long run, however, M2M cloud platform providers will have to extend their services to both in-house and other industry participants’ modules by providing access to the necessary interfaces and on-device client software (if implemented as part of the connectivity architecture). Here is a profile of a company, Axeda, that has been a pioneer in M2M and has helped address many of the M2M application development and deployment challenges.

About Axeda

Axeda provides the leading cloud-based service and software for managing connected products and implementing innovative M2M applications, taking the cost and complexity out of connecting and remotely servicing the products of the world’s leading companies. Axeda customers use its M2M cloud service to deliver innovative M2M solutions and optimize their business processes with data from their connected products. By relying on the Axeda M2M Cloud Service to power their connected products, companies are transforming their businesses by improving customer satisfaction, reducing costs, and generating new sources of revenue. The M2M solutions behind these connected products span remote service, fleet management, usage-based insurance, asset tracking, mHealth, and more.

Key Offerings and Value Proposition

The Axeda M2M Cloud Service provides advanced cloud-based software for managing connected products and assets and implementing innovative M2M applications. The service provides companies with a secure and scalable M2M data integration and application development platform, connectivity over wired or wireless networks, and out-of-the-box device and asset management applications to reduce the cost and complexity of implementing M2M solutions.
The Axeda M2M Cloud Service includes:

Axeda® Platform – A secure and scalable data integration and application development platform with M2M application services and data management features for building and managing an M2M solution.

Axeda M2M Connectivity – A family of software agents, libraries, toolkits, and services for establishing connectivity between assets and the Axeda Platform, using a variety of communication methods (Internet, WiFi, satellite, and the AT&T network).

Axeda Connected Product Management Applications – A suite of out-of-the-box web-based applications spanning remote service, remote access, and software and content distribution that enable companies to manage connected products and remotely identify, diagnose, and repair issues.

Source: Axeda

The Axeda Platform

The Axeda Platform is a complete M2M data integration and application development platform with infrastructure delivered as a cloud-based service. With the highest levels of scalability and security as well as powerful development tools and flexible APIs, companies can quickly build and deliver custom M2M applications for the most demanding requirements and integrate M2M data into key enterprise applications and systems. The Axeda Platform includes:

M2M Application Services – Allows developers to extend and customize the core platform functionality via a powerful embedded scripting engine and a rich set of Web Services for both SOAP and REST consumption.

M2M Integration Framework – Accelerates integration between the Axeda Platform and enterprise systems with standards-based message queue technology.

M2M Data Management – Processes and stores incoming M2M data; manages device and asset types, data items, locations, alarms, and files; and includes built-in security for managing users, roles, user groups, and device groups.
Source: Axeda

Axeda M2M Connectivity

The Axeda M2M Cloud Service includes M2M connectivity services, software agents, and toolkits that enable companies to establish connectivity between their devices or assets and the Axeda Platform, while allowing them to choose the communication method and hardware that best suit their needs. As a result, companies can connect to any product using any device, over any communication channel (cellular networks, the Internet, WiFi, or satellite), for any application. Axeda M2M Connectivity includes the following solutions, depending on the class of device or asset that needs to be connected:

Firewall-Friendly Agents – Software agents that run on Linux or Windows and install directly on assets, or on a networked gateway computer connected to corporate assets.

Wireless Agent Toolkits – Java and ANSI C libraries for embedding Axeda connectivity into devices; they can be compiled into a company’s own software and executed on a wide array of computing hardware and platforms.

Device Protocol Adapter – A device communication server that connects to any M2M message protocol and can be extended with custom CODECs (coders/decoders) that translate a device’s native communication format into a form that the Axeda Platform can understand and process.

Policy Server – A server-based software application residing on the customer’s network; the Axeda Policy Server provides a comprehensive and granular set of permission settings that continuously governs Axeda Agent behavior for all devices at the customer location.

Axeda Cloud Service

As advanced cloud-based software for managing connected products, the Axeda M2M Cloud Service is delivered as Software-as-a-Service (SaaS) via ISO 27001:2005-certified data centers, backed by complete security, scalability, and infrastructure, and first-class operations and customer support.
With Axeda’s on-demand service, enterprises enjoy all the benefits of the Axeda M2M Cloud Service without the challenges and overhead of administering and implementing the technology and infrastructure. Axeda’s on-demand service provides customers with a pay-as-you-go model that minimizes risk and enables rapid deployment for faster ROI. Additional benefits include:

Rapid and easy implementation – Reduce initial project implementation requirements as well as the struggle of gaining IT approval and ongoing support.

Lower up-front costs and minimized risks – Take advantage of much lower upfront capital investment. An annual subscription fee enables companies to easily incorporate the ongoing annual expense into planned budgets.

Focus on the business, not IT infrastructure – Free organizations from supporting high-cost, time-consuming IT functions, including purchasing, supporting, and maintaining the server infrastructure, equipment redundancy and housing, and the labor-intensive patch and upgrade process.

Faster time-to-value – Reduce the time required to install licensed software and instead use Axeda’s first-class on-demand center, which undergoes an annual SAS 70 examination, to accelerate solution deployment.

Greater end-customer acceptance – Enable end customers to preserve their own security policies and network protection. With its cloud service’s security certified by a trusted third party, Axeda gives the assurance needed to ease adoption of the solution by even the most security-focused customers.

Enterprise security, availability, and scalability – Rely on Axeda’s secure and scalable infrastructure, built on state-of-the-art hardware and software investments and backed by its operational expertise. Ongoing management is performed by experts in the Axeda application, networking, security, hosting, data protection, and database administration.
Custom application hosting – Companies can host the custom applications that they build and deploy on the Axeda Platform at Axeda’s on-demand center, simplifying the maintenance and administration of complete M2M solutions.

Key Customers and Partnerships

More than 150 of the world’s leading companies, including Abbott, Diebold, and EMC, rely on Axeda to get to market fast with their connected product solutions at the lowest total cost of ownership. A partial list of Axeda customers is available at http://www.axeda.com/community/customers/all/a
The Axeda Partner Ecosystem includes a wide range of device/module OEMs, Mobile Network Operators, System Integrators, Enterprise Application Providers, Logistics Solution Providers, and others. A partial list of Axeda’s partners is available at http://www.axeda.com/community/partners/find-partner
In January 2012, Axeda announced an exclusive multi-year U.S. reseller agreement with AT&T, which delivers the AT&T M2M Application Platform Powered by Axeda -- designed to help businesses and organizations get to market faster with innovative M2M applications.
Final Word
By providing end-to-end application development, connectivity, and service and support capabilities, Axeda has emerged as a major industry participant. There is tremendous opportunity in M2M, and companies such as Axeda will continue to see good traction for their offerings.
by Ben Ramirez 05 Aug 2012
Security research wizard Rodrigo Rubira Branco discussed how his Dissect.pe Project can bring value to organizations globally. At the beginning of my sit-in, I learned that the security researchers were all from Brazil. I have to admit that I love Brazil for its festive culture and its cuisine; it should definitely be on your vacation bucket list of ‘must-dos’. But what you must also know is that Brazil is home to some of the most dedicated, highly intelligent security researchers in the world. Let’s first review where malware stands today. Malware has become a huge thorn in the side of the corporate world. According to Panda Security, malware hit a record high of about 26 million new strains discovered in 2011. This is quite concerning, considering that corporations and government agencies are trending toward bring-your-own-device (BYOD) laptops, tablets, and smartphones for their employees. As a result, organizations must harden their security strategies to protect these BYOD endpoints against sophisticated malware or face a potentially catastrophic loss of valuable IP data. But detecting these malware payloads remains one of the key challenges security intelligence analysts face. While attentively listening to the lecture, A Scientific (But Non Academic) Study of How Malware Employs Anti-Debugging, Anti-Disassembly and Anti-Virtualization Technologies, I heard some fascinating yet daunting results on the challenge of combating evolving malware. Rodrigo and his colleagues, Gabriel Negreira Barbosa and Pedro Drimel Neto, discussed how counter-detection methods are being used extensively in the millions of malware samples analyzed under their Dissect PE Project, a scalable and flexible automated malware analysis engine that provides feedback to the security community. 
Dissect PE allows security researchers to write ‘plugins’ in any programming language (C, Perl, Python, etc.) to output malware analysis. The project is being made public, allowing security researchers, the media, and partners to share their malware code analysis through the portal on a global level. Some interesting facts I discovered:
There are 10 dedicated machines, located in São Paulo, Bauru, and Germany
The project receives 150+ gigabytes of malware samples per day
A total of 30+ million unique malware samples have been analyzed so far
So why is this important? We already have companies like FireEye, Bit9, Websense, Symantec, Cisco, McAfee, and Damballa performing malware analysis. Well, as it turns out, recent Dissect.pe results showed malware using at least four different anti-reverse-engineering (Anti-RE) techniques to detect, and even compromise, the software and hardware processes used to analyze it, all in order to evade detection. Let me briefly define each one, as the lecture did:
Anti-Debugging - Techniques to compromise debuggers and/or the debugging process
Anti-Disassembly - Techniques to compromise disassemblers and/or the disassembly process
Obfuscation - Techniques to make signature creation more difficult and disassembled code harder for a professional to analyze
Anti-VM - Techniques to detect and/or compromise virtual machines
Even more interesting was the fact that anti-virtual-machine (Anti-VM) techniques were the most common category discovered. This is because security vendors commonly use virtual machines to capture and analyze malware. So the big question to ask is: will executives who have purchased commercial security solutions still lie awake at night because of insufficient malware detection power? 
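To make these categories concrete, here is a minimal, hypothetical sketch of the kinds of checks an Anti-Debugging or Anti-VM sample might perform. This is not code from the lecture or from Dissect PE; the function names and the timing threshold are illustrative assumptions, and real samples use far more robust variants:

```python
import pathlib
import sys
import time


def debugger_attached() -> bool:
    """Anti-Debugging: CPython exposes any active trace function
    (installed by debuggers such as pdb) via sys.gettrace()."""
    return sys.gettrace() is not None


def timing_anomaly(threshold_s: float = 0.5) -> bool:
    """Anti-Debugging: single-stepping or heavy instrumentation
    inflates the wall-clock time of trivial work far past normal."""
    start = time.perf_counter()
    _ = sum(range(100_000))  # trivial work; microseconds when run natively
    return (time.perf_counter() - start) > threshold_s


def vm_suspected() -> bool:
    """Anti-VM (Linux-only heuristic): most hypervisors set the
    'hypervisor' CPUID flag, which the kernel echoes in /proc/cpuinfo."""
    try:
        cpuinfo = pathlib.Path("/proc/cpuinfo").read_text()
    except OSError:
        return False  # not Linux, or file unreadable
    return "hypervisor" in cpuinfo


if __name__ == "__main__":
    if debugger_attached() or timing_anomaly() or vm_suspected():
        print("analysis environment suspected; showing benign behavior only")
    else:
        print("no instrumentation detected")
```

A sample that passes these checks behaves maliciously only on what it believes is a real victim machine, which is exactly why analysis engines like Dissect PE must catalog and counter such evasion techniques rather than rely on a single sandbox run.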
My answer is yes. If security companies are still under the impression that their dedicated malware analyzers are doing an adequate job of determining the presence of malware, then we are in big trouble, because malware authors continue to rewrite their malicious codebases daily to adapt to these detection systems. And with Rodrigo’s findings, it seems the security intelligence community now has a two-fold job in malware analysis: determining whether a threat is malware, and whether that same malware package is trying to avoid detection using the four techniques listed above! I have to be forthcoming: I was extremely pleased by, and ‘sold’ on, this project. The real value of the Dissect PE portal is as a sanity check and a means to double-check malware findings from other systems. But as Rodrigo mentioned, there is much more work to be done, and he asks security researchers around the world to contribute their findings and improved algorithms to the project in order to continuously counter these evasion techniques. Rodrigo comically stated that “Brazilians are not that lazy” and that they “missed a lot of parties” while analyzing these millions of samples. But this is the passion needed to win this never-ending battle against malware.
by Katherine Burns 03 Aug 2012
Every once in a while, you read something that really blows your hair back. Few things are better than having a moment of enlightenment – having a new idea presented to you that helps you make sense of your world in a better, simpler way. That’s what happened to me when I came across a book from the 1960s entitled The Structure of Scientific Revolutions. Its author, Thomas Kuhn, argued that periodically the practitioners of a shared discipline find that the framework (or paradigm) in which they operate has been undermined by a series of events that cannot be explained by the prevailing paradigm. As these incidents continue to accumulate, they combine to create a state of crisis. Out of the crisis and the chaos comes a revolution, and out of that an altogether new paradigm – a new way of looking at the world, a new framework for working, existing, and thinking. What an incredibly simple, but sophisticated, concept: a period of stability, followed by a period of chaos, followed by a new order of things. I know the phrase “paradigm shift” is no longer new, but that didn’t make it any less earth-shattering to me when I first came across it. This idea has deeply, fundamentally affected the way I look at things, and it certainly affected my approach to writing the growth process toolkit for technology strategy, which I have discussed in my two previous blog posts. We are living through a paradigm shift right now – each of us trying to make sense of the chaos and searching for clues about what the new paradigm will be. Funnily enough, my own life is in the midst of something of a paradigm shift itself. I’m already a mother to a wonderful two-year-old boy, and I am about to have a daughter. My stable world will soon, to quote myself, find itself in a state of chaos. What will the new paradigm look like? And what is my daughter’s paradigm going to be? How will she look at the world? What truths will govern it? 
I’ve been thinking about this a lot lately – what do I want to teach her? What do I want her to like (hint: old movies, Esther Williams, sparkles)? What do I want her to not like (hint: scary movies, New York Giants, sugar)? What things do I really want her to believe? At the risk of falling into platitudes, I thought I’d share a few of my lessons with you all (no guffaws or eye-rolling allowed, I don’t care if it’s cyberspace). Fred Astaire made it look easy. The lesson: the harder you work, the more effortless it will seem. There is no substitute for hours upon hours of practice, frustration, setbacks, and breakthroughs. Talent alone is one step above laziness. Find your brilliance. My dad used to tell me, you’re probably not going to be a genius at everything. But you might be lucky enough to be a genius at one thing. Have the courage to run at that strength with everything you’ve got. Use your words. The English language is a wondrous thing. Treat it with respect. Learn your grammar. Diagram sentences. Speak properly. Write beautifully. Read E.B. White and P.G. Wodehouse. Listen to Cole Porter. Daddy’s wrong about Mommy’s movies. Just because it’s old doesn’t mean it’s outdated. I know there’s something to be said for special effects, but could any technology of today improve upon Gone With the Wind? I rest my case. The past is a treasure trove of awesomeness. Every day is a happy day. Start every day believing that it will be better than the one preceding it. Never think you’ve peaked. What was the line from Anne of Green Gables? Each new day is a new beginning, with “no mistakes in it.” Isn’t that comforting? Anyway, this is sort of my blogging swan song, at least until November. So with that, I leave you all with a few thoughts: 1) I just recorded a podcast revisiting the concept of technology strategy (its opportunities and risks, success stories and cautionary tales), and I’d love for you all to take a listen. 
Please forgive my voice – I’ve been battling laryngitis. My husband says I just talk too much. 2) If the toolkits seem interesting to you, take note: There are 10 of them! You can see them all here. 3) If you’re not a member of Growth Team Membership, and therefore can’t access these materials but would like to, let us know. 4) Enjoy the Olympics! Enjoy Halloween! I’ll see you at Thanksgiving! As always, happy computing. Katherine Burns Katherine is the Director of Strategic Communications for Growth Team Membership, a premier best practices research group within Frost & Sullivan. You can follow her on Twitter: @KatherineSBurns.