Blog archive - May 2012
Use the blog to discuss and comment on the latest industry insights provided by our analyst experts.
by Katherine Burns 30 May 2012
Well, it’s my first blog. Not just my first one for Frost & Sullivan – my first blog ever. As a professional writer and professional communicator, I guess this means I’m somewhat old-fashioned. I’d rather write something longhand than type it, and I wish we’d all put the Postal Service back in business by sending each other some letters. (Remember letters? No “you’ve got mail” ding, dong, ping, or gong could ever be better than the silent anticipation of opening an envelope.)

Anyway, I’m old-fashioned in most aspects of my life. I love old movies. I love old music. A few weeks ago my husband asked me to name a famous Grunge band, and the best I could do was Aerosmith (apparently they are NOT “Grunge”). Somehow this makes me eccentric, whereas the fact that he couldn’t tell me the difference between Ella Fitzgerald and Julie London just means he’s cool. Whatever. And so I’ve skipped through life, mostly paying homage to things that happened before I was born, and all to a Cole Porter soundtrack. I’ve watched Singin’ in the Rain more times than I can count; I’ve memorized all of Fred Astaire’s movies. I’ve devoured books on the Golden Age of Hollywood. Mid-20th Century detective stories are my vice (I’ve read them all, but Nero Wolfe’s brownstone is my absolute ideal…and if that doesn’t mean anything to you, do yourself a favor and pick up Some Buried Caesar, or maybe Champagne for One).

Speaking of mid-20th Century detective stories: One of the lovely things about them is the way the detecting is done. There’s no scanning of Twitter pages, no research of Facebook posts. The hero might read back issues of the New York Times, or he might pay a visit to the library. He might even go really high-tech and type something, on a really snazzy machine like an Underwood. In all, a decidedly low-tech (but always successful) way of arriving at whodunit. Imagine my anxiety, then, when I was asked to write about technology.
Not the technology of yesteryear—but the technology of tomorrow! Technology that hasn’t even happened yet, and how we can predict it and prepare for it! Did I mention I’ve barely started to blog? As I like to remind my boss, I have only to look at a computer to fry its insides past the point of hope or redemption. Perhaps I should explain why I was so chosen. I’m responsible for writing a series of Growth Team Membership deliverables called the Growth Process Toolkits. These toolkits are essentially primers – how-to manuals – on key topics that drive a company’s top-line growth. For example, we’ve published toolkits on M&A, new product launch, distribution channel optimization, and more. We needed to write a toolkit on technology strategy, and as the author of the series, the responsibility fell to me. I was, as I said, somewhat hesitant to begin. What could I possibly teach on this subject, when I was so ill-informed myself?

And then I realized two lovely things all at once: I’m not the only one who’s overwhelmed by the rapid, nearly disorienting pace of technology evolution today. It’s OK to acknowledge this feeling, and to empathize with others who may also be struggling to make sense of the chaos. That’s why I decided to open the toolkit with a quotation from historian Henry Adams (for you history buffs out there, Mr. Adams was a direct descendant of John Adams, his great-grandfather). I came across this passage while reading David McCullough’s wonderful new book The Greater Journey: “Every day opens new horizons and the rate we are going gets faster and faster till my head spins and I hang on to the straps and shut my eyes.” He wrote those words in 1900 – but how apt they seem today! Maybe not everyone reading this would self-describe as “old-fashioned,” the way I have, but I think everyone can relate to that sentiment and sometimes feels powerless to keep up with…well, anything today. We live in a crazy time.
That’s probably why we need a toolkit on technology strategy in the first place. And that leads me to my second realization: The ideas don’t have to be mine; I just have to present them clearly. One of the great things about my job, and about writing these toolkits, is that I get to work with extremely smart people. I may not know a lot about technology, but I know who in our company does, and I know how to seek them out, ask them questions, and see how they’ve helped others think through technology-related challenges. All I have to do is collect the goods and translate them into a single, cohesive story. I might not be good at blogging (am I?), but I can certainly do that. And so there was no need for trepidation; in fact, this was a good chance for me to learn about something that I’ve avoided, perhaps to my detriment, for a very long time. I’m happy to say that the production of this new toolkit has been a learning experience for me; I hope reading it will be one for you as well. That last sentence probably makes it sound like the toolkit is finished. It’s not. But it will be soon, and we’ll share it with you as soon as it is. Check back in with us next month, and we’ll provide some more detail on it (that, and my favorite detective stories). Until then, happy computing.

Katherine Burns
Katherine is the Director of Strategic Communications for Growth Team Membership, a premier best practices research group within Frost & Sullivan.
by Holly Lyke Ho Gland 29 May 2012
Frost & Sullivan’s Growth Team Membership™ (GTM) recently completed its 2012 survey of R&D/innovation and product development executives throughout Europe. The executives were asked to identify their most pressing challenges for 2012. The survey reveals that R&D executives continue to struggle with effective portfolio planning and with leveraging a wide network for idea generation. Moreover, respondents are challenged by how to generate an accurate technology map—outlining customer needs, available solutions, and technology gaps—to guide portfolio planning and project prioritization. The other prominent challenge is a perennial one: identifying the next breakthrough idea. To examine these challenges in more depth, the survey asked respondents to “root cause” their top challenges by indicating whether they stem from issues with staffing, process, technology/systems, or strategic alignment. R&D executives attribute their challenges to two primary causes: limitations in staffing and in processes. R&D executives are unlikely to see additional staff in 2012; most respondents expect staffing levels to remain static. On a positive note, budgets are expected to increase in 2012. Despite the emphasis on breakthrough innovation, most of the budget increases will be allocated to short- and medium-term, incremental innovation projects. In view of open innovation’s (OI) growing prominence and potential to help R&D develop emerging or disruptive technologies, the survey asked respondents about their use of OI. Surprisingly, given its prominence in the last two years’ survey results, the majority of respondents do not leverage OI in their product development processes. This may be attributed to respondents’ challenges with establishing partnerships and measuring the ROI of OI efforts. With regard to creating OI partnerships, respondents struggle with identifying partners with the right IP, establishing clear communication channels, and building sustainable trust.
The R&D departments that are embracing OI tend to use it for idea generation and screening during the product development life cycle, and customers are their primary source of ideas. In terms of staffing for open innovation activities, most respondents employ part-time technology scouts.

Open Innovation for Idea Generation - 2012 R&D Innovation Priorities Survey Results
by Bisakha Praharaj 26 May 2012
My work primarily revolves around Data Centers in India and what’s new with them. That said, over roughly the past two years we have seen Data Center providers enter with a new offering in the form of pay-per-use services, or IaaS. IaaS being the primary cloud offering, some providers have also ventured into the other flavours of PaaS and SaaS services through a wide range of partnerships and collaborations. What got me writing about this topic today is the team discussions we have been having of late over a new research report titled “Cloud Computing for SMBs in India”. It is extremely intriguing to see the various industry segments interested in having a share of the cloud pie! We not only have Telcos, Carrier Neutral ISPs (traditional DC service providers who would lean towards IaaS) and software vendors (who are the pioneers for SaaS), but a wide segment encompassing pure-play cloud vendors, system integrators, software services players and virtualization vendors.

Figure 1: Existing IaaS providers in India

Above is a snippet of the IaaS market players in India. It is interesting to note the inclusion of pure-play vendor Amazon showing an interest in operating out of this geography. As of now the closest DC hosting EC2 is in Singapore, which might be a concern to IT managers who have a strong preference for hosters operating from India. To address this concern and government-imposed data privacy regulations, and to become an active player in this market, AWS is reportedly in talks with Tata Communications to co-locate at their facility and offer EC2, S3 and other services. HP and IBM launched their services in beta a couple of months back, and time will tell which segment Indian companies show a preference towards. Moving on to PaaS, often seen as the next-in-line offering after IaaS, we do see various Indian and international players offering it.
India being an IT development hub, this service would find a key buyer base amongst the ISV community. As of now, we see Indian vendors such as Wolf Frameworks, OrangeScape, Ozonetel, Net4 & Netmagic (in partnership with OrangeScape) and Sify offering it. PaaS is the most nascent flavour and is yet to penetrate the Indian market fully, expanding beyond serving just the ISV segment. Last, SaaS, the strongest offering, could have a complete book dedicated to it, let alone a blog post. In any case, from a Data Center Services perspective, we see quite a few players such as Tata Communications, Reliance Communications, Trimax, Hughes and OneCloud (BSNL + Dimension Data) who offer email, CRM and productivity suites as-a-service targeting the SMB community. To be sure, these address a smaller buyer pool in comparison with game changers like SalesForce.com, Microsoft’s SaaS products (Exchange, Office 365, SharePoint & Dynamics CRM Online), Zoho, Ramco and many more. Once you put all these flavours of cloud together and carry out a portfolio mapping exercise, it is really astounding to see the various IT segments gearing up to serve the growing demand. Only time will tell which segment makes better inroads into the Indian market.

Bisakha Praharaj
Bisakha is a Senior Research Analyst with the Information & Communication Technologies Practice for Frost & Sullivan South Asia & Middle East region.
by Jannette Whippy 24 May 2012
As an in-house designer who works alone, I require constant feedback. It’s the only way to know if I am on the right track. Recently, I had an epiphany: I have been ignoring my best available design resource, my fellow designers. I could have kicked myself for the oversight. I was working on a poster that was just not fitting together well. I ignored my first instinct to send it to a friend, and instead sent it to a fellow designer in a different department. Her insightful comments and suggestions helped me see the holes in my design, and the fixes we discussed made the poster better. Feedback is only as good as the reviewer. If your reviewer doesn’t know your intended audience or have much experience with your subject, their feedback (while interesting) is not as meaningful as that of another, more appropriate reviewer. Take care in gathering feedback. Your work will get better if the feedback comes from someone who understands, or is part of, the audience you wish to engage. As Seth Godin says: “Shun the non-believers.”
by Austin Pullmann 23 May 2012
Or rather, what is an example of a subject line you simply could not resist opening? I can think of a few – “How you can do what xx did”, or “Does this version work for you?”. Sometimes no subject line is the most effective of all. The airlines could stand to improve at this. I’m subscribed to perhaps every domestic airline’s email list, and the emails (judging by subject line) have virtually nothing to say. Subject lines consist of “Austin, check out these great offers”, or “Take advantage of our (insert month) sale”. The airlines must also get data from the same source on the best times/days to launch an email, since they tend to dump into my inbox at roughly the same time. In an effort to boost open rates and increase email campaign effectiveness, here are three tips:

Pique curiosity – This can be achieved in a number of ways. Introducing an incomplete thought that can only be completed by opening the email is very effective. For example, “Do you believe it?” virtually requires a reader to open it and learn more.

Get to the point – Every word chosen either adds to or detracts from the message. Limit subject lines to 50 characters or less, and ideally just 4-5 words maximum. The goal of the subject line is to get the reader to open the message. Once that’s been accomplished, the message itself can convey your objective.

Don’t shoot yourself in the foot – There are certain terms that recipients are reluctant to open (such as “Free!”, “Reminder”, or “Help”). Even worse, some terms are likely to be blocked by spam filters. Choose wording that will grab the reader and entice them to open the message.

Use these tips in your next email campaign and compare the results with previous campaigns. Post tips of your own if I'm missing anything critical!
by Holly Lyke Ho Gland 23 May 2012
Frost & Sullivan’s Growth Team Membership™ (GTM) recently completed its 2012 survey of R&D/innovation and product development executives globally. The executives were asked to identify their most pressing challenges for 2012. GTM will focus its best practices research to address the prominent issues identified in the survey. This year’s survey indicates R&D executives struggle with two chronic challenges: (1) managing the product portfolio and (2) finding the next disruptive idea. With regard to portfolio management, respondents struggle to develop accurate technology maps for planning, to prioritize innovation projects, and to measure their portfolios’ success rates. Overcoming these challenges requires R&D executives to map out their portfolio strategy and develop key performance indicators to guide project prioritization and monitoring. Identifying the next breakthrough idea rarely involves a “Eureka” moment, but it does require time and resources. While companies understand the need to develop emerging technologies, they are reluctant to commit substantial resources to high-risk projects. This risk aversion appears to be impacting R&D budget allocations—budgets for incremental innovations are increasing, while disruptive technology budgets remain stagnant. The survey asked respondents to “root cause” their top challenges by indicating whether they stem from issues with staffing, process, technology/systems, or strategic alignment. R&D executives attribute their challenges to two primary causes: understaffing and processes (ineffective or nonexistent). On a more positive note, R&D executives foresee additional resources—both staffing levels and budgets are expected to increase in 2012. Given the pressure on R&D executives to tap into new ideas and emerging technology, survey respondents were asked about their use of open innovation (OI). The majority of respondents employ some form of OI team—typically a small, dedicated sub-group within R&D.
It comes as no surprise that OI’s primary role in the product development life cycle is idea generation and that customers are the primary source of ideas. Even though companies are committed to using OI to increase their ability to develop emerging or disruptive technologies, respondents struggle with the fundamentals of establishing an OI process: securing internal buy-in, getting resources for idea testing, and creating a collaborative framework with external partners.

Idea Generation and Portfolio Management - 2012 R&D Innovation and Product Development Priorities Survey Results
by Holly Lyke Ho Gland 22 May 2012
Frost & Sullivan’s Growth Team Membership™ (GTM) recently completed its 2012 survey of R&D/innovation and product development executives in North and South America. The executives were asked to identify their most pressing challenges for 2012. GTM will focus its best practices research to address the prominent issues identified in the survey. According to the 2012 survey results, R&D executives continue to wrestle with portfolio management. Specifically, R&D executives need to prioritize innovation projects, balance the value and risk of the portfolio, and allocate budgets across a wide range of project categories. Respondents also struggle with two other persistent issues: (1) identifying breakthrough ideas and (2) integrating inputs from internal stakeholders (e.g., Sales and Marketing) into portfolio planning. The survey reveals differences in challenges across business models. For example, the key challenges for R&D executives in B-to-B companies are generating technology roadmaps for portfolio planning and managing an open innovation process. In contrast, their peers in B-to-C companies are challenged by securing senior management buy-in for promising innovations and streamlining the product development process to reduce costs. The survey asked respondents to “root cause” their top challenges by indicating whether they stem from issues with staffing, process, technology/systems, or strategic alignment. By and large, R&D executives attribute their challenges to understaffing. Fortunately, staffing and budgets are expected to increase in 2012. Ironically, though respondents stress the importance of driving breakthrough innovations, short-term or incremental projects account for the majority of the 2012 budget increase. Given the potential of open innovation (OI) to tap emerging technologies, survey respondents were asked about their use of OI. The majority of respondents (58%) apply OI approaches to their product development process.
When asked about the role OI plays in product development, respondents report using OI for idea generation and screening. However, respondents in B-to-C companies are more likely to use OI throughout the product development process than their B-to-B peers. The composition of respondents’ OI teams varies by business model. Respondents within B-to-B companies employ small, dedicated open innovation teams, while R&D executives in B-to-C companies use part-time technology scouts. While most R&D organizations engage in OI activities, R&D executives still struggle with implementation: establishing an effective OI process, garnering resources, and collaborating with partners. Respondents in B-to-B companies are focusing on developing a method to measure the ROI of OI activities, while their peers in B-to-C companies are endeavoring to establish a structured process to test the feasibility of idea submissions.

From Portfolio Management to Open Innovation - 2012 R&D Innovation and Product Development Priorities Survey Results
by Jake Wengroff 18 May 2012
Some thoughts about Facebook now that it is a public company, and implications for the social networking industry.
by Roopam Jain 18 May 2012
Disappointing Q1 results from market leaders Cisco and Polycom, coupled with muted guidance for Q2, do not bode well for the videoconferencing markets. Revenues nose-dived in Q1, raising the big question: can videoconferencing continue the phenomenal climb it has seen over the last several quarters? Our recent research indicates that 2011 was a watershed year for videoconferencing. Revenues for endpoints and infrastructure systems surged to $2.8 billion in 2011, growing at 22 percent over the prior year. Prices were stable and user demand remained strong throughout last year.

Q1 2012 Videoconferencing Revenue Highlights

Revenues grew 10% Y/Y, with endpoints growing at 11.4% and infrastructure at 6.6%. Most vendors, with the exception of Cisco, reported a Y/Y revenue decline. Cisco was an aberration primarily due to its low sales in Q1 last year. Q1 2011 was a weak quarter for Cisco, largely due to channel issues and delayed orders resulting from the transition of its ordering system after the Tandberg acquisition. Q/Q revenues declined by a whopping 22.6%, from $830.7M in Q4 2011 to $642.9M in Q1 2012. Endpoints declined by 21.5%; the infrastructure decline was 25.4%. This seems to be 2 to 3 times the seasonal dip normally seen in the first quarter of the year. Among the major vendors, the biggest Q/Q decline came from Cisco, followed by Polycom, leading to a market share loss for Cisco in Q1. Radvision reported better-than-expected Q1 earnings and LifeSize held its revenues stable.

Top Reasons for a Soft Q1

A confluence of several factors is coming into play in the videoconferencing markets.

An uncertain economic climate – Concerns about the state of the global economy and cautious spending are putting a damper on videoconferencing growth. Vendors are reporting longer sales cycles and delays in the approval process.
Furthermore, with the exception of travel reduction, the soft benefits of videoconferencing make it a hard sell to decision makers who are focused on the overall bottom line.

Sales restructuring and channel issues – Major vendors have cited internal delays due to reorganization of sales teams as they move from selling point videoconferencing products to broader UC solutions. Polycom continues to show weakness in North America as it works on improving sales execution in the region.

Impact of alternatives – Microsoft's increased marketing around Lync being enterprise-ready for voice, in addition to its collaboration offerings, has led customers to evaluate Lync as an alternative, delaying the decision-making process. Microsoft reported 35% revenue growth in Q1 for Lync. Citrix Online reported 29% revenue growth for its collaboration products, with strong adoption of HDFaces video on PCs and mobile devices. We expect customers will continue to evaluate and adopt alternatives to traditional telepresence solutions to meet their collaboration needs.

Pricing pressures – A significant slowdown in adoption of high-end immersive telepresence suites, as well as greater competitive pressure from vendors such as Huawei and Vidyo that continue to nibble away market share, is leading to lower prices in the industry and impacting revenues.

Seasonality and buying patterns – Q1 has traditionally been the weakest quarter for the videoconferencing industry due to buying cycles. Following on the heels of strong purchasing in Q4, customers are focusing on roll-outs during the early part of this year. 2010-2011 was when wallets began to open up, and now we are starting to see those deployments settle in. We expect to see a buy-deploy-use-upgrade cycle.

Despite the Q1 market performance, we believe the long-term forecast for videoconferencing remains strong.
Vendors are stepping up their game with a focus on stronger sales execution and investments in innovation. Video will continue to be top of mind as the next-gen mode of communications, with the bulk of the market remaining underpenetrated. At the same time, it is now well understood that the face of videoconferencing is changing at a fast pace. End user discussions have shifted away from being exclusively focused on high-end telepresence to now include video to desktops and mobile devices, and being able to deliver it economically. The emergence of tablets as a videoconferencing device and the impending shift to software-based solutions, coupled with the impact of cloud-based delivery, will open up new opportunities and at the same time upend established business models.
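For readers who like to check the math, the quarter-over-quarter decline cited in this post follows directly from the reported revenue figures. A quick sketch (the helper function is mine, not from the research):

```python
def pct_change(old, new):
    """Percentage change from old to new (negative means a decline)."""
    return (new - old) / old * 100

# Reported totals: $830.7M in Q4 2011, $642.9M in Q1 2012.
qq = pct_change(830.7, 642.9)   # about -22.6%, matching the cited figure
```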
by Mohamed Alaa Saayed 17 May 2012
Polycom sold its wireless endpoint business to Sun Capital (an affiliate of Sun Capital Partners Inc.) for approximately USD 110 million. Many see this decision as wise and logical, allowing the company to focus on its core video and UC business, as well as improve its image as a software vendor rather than a hardware company. However, I happen to see things differently.

1. Why leave a growing market? The Voice over Wireless Local Area Network (VoWLAN) single-mode and Digital Enhanced Cordless Telecommunications (DECT) phone market is on a healthy growth path. In 2010, the world enterprise DECT and VoWLAN single-mode phone markets experienced positive growth rates of 26.7 percent and 35.2 percent, respectively, in terms of units shipped. While growth rates were higher that particular year due to pent-up demand from the economic downturn, projections for both markets are very optimistic. Over the 2011-2017 period, the market for VoWLAN single-mode devices is expected to post a healthy CAGR of 35.3 percent, while the more mature DECT market grows at a CAGR of 12.2 percent, both in terms of units shipped.

2. Why abandon a leadership position? Polycom is among the top five vendors of both VoWLAN single-mode devices and enterprise DECT phones. Polycom’s combined growth rates for both product lines were well above the industry average. Before the decision to sell its wireless unit, Polycom held some of the largest partner relationships in the market. Revenue from its OEM shipments was growing and was a major part of the company’s wireless strategy. OEM partners included Avaya, Alcatel-Lucent, and NEC. Polycom also worked with Siemens, Mitel, and several other partners that functioned as resellers.
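To put those CAGR figures in perspective, compounding them over the six years from 2011 to 2017 shows how much total growth the projections imply. A quick illustrative calculation (the helper is mine; the rates are from the forecast above):

```python
def growth_multiple(cagr_pct, years):
    """Total growth multiple implied by a CAGR compounded over a number of years."""
    return (1 + cagr_pct / 100) ** years

# 35.3% CAGR for VoWLAN single-mode devices, 12.2% for DECT, 2011-2017.
vowlan = growth_multiple(35.3, 6)   # roughly 6.1x 2011 unit shipments by 2017
dect = growth_multiple(12.2, 6)     # roughly 2.0x 2011 unit shipments by 2017
```

In other words, the projected VoWLAN market would be about six times its 2011 size by 2017, and even the "mature" DECT market would roughly double.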
The divestiture of these product lines not only eliminates a solid revenue stream for Polycom, but also takes away the opportunity to further penetrate these partners’ broader UC&C suites with wireless and other endpoints.

3. Why lose an entry point into some key verticals and geographic regions? With its wireless portfolio, Polycom had gained considerable traction in specific vertical industries such as healthcare, retail, hospitality, and manufacturing. Although it continues to serve these verticals with its IP desktop phones and video/UC solutions, stripping out the wireless endpoint business could curtail opportunities within verticals that still favor ruggedized, high-quality wireless endpoints over other types of mobile devices. Also, in the case of Polycom's Kirk line specifically, the vendor gained considerable worldwide presence in more than 40 countries where the solution was sold. Europe, specifically, has been the main battleground for DECT technology. Through different DECT deals, the vendor could have used this as a window to promote all its other voice and UC-related technologies in a regional market that has always been key to communications vendors.

I'm sure Polycom had good reasons to sell its wireless division. However, the on-site mobility market still offers very good growth opportunities that will now be leveraged by Sun Capital. We wish both Polycom and Sun Capital the best of luck with their new strategies.
by Michael Brandenburg 17 May 2012
Sometimes it takes a fire-hose event like a trade show to remind those of us who follow the UC&C space where all of these wonderful productivity tools (VoIP, video and web conferencing) live: the enterprise network. All of those rich media streams of audio, video, and data content ride along the same network that carries everything else that keeps the business running. If the network isn’t up to the task, the best communications and collaboration tools will fall down with it. That is why it is heartening to see renewed interest and new energy going into the network, which featured prominently at Interop 2012 Las Vegas.

Software-Defined Networking

The biggest buzz at the show was around the concept of software-defined networking (SDN). The premise of SDN is simple: separating the data plane, the parts of the network switch that move packets from point A to point B, from the control plane, the supervisory components that define the makeup of that network. These two planes today exist as dedicated hardware within the modern network switch. In software-defined networks the data plane, in either hardware or virtual switches, becomes subservient not to its on-board supervisor, but to an external controller application running on physical or virtual servers. In the case of OpenFlow, a small piece of software, an agent, is embedded within the firmware of the network switch, directing it to take commands from the external application.

In theory, software-defined networks make a lot of sense. The controller server is not bound to a single network switch to supervise; it can deliver command-and-control capabilities across every network switch in the enterprise. The controller software can be just another rack of physical servers or run as multiple virtual machine instances to provide redundancy and high availability. This architecture can be significantly less costly than paying for redundant hardware supervisors in every single network switch.
With a high level of visibility, the SDN controller can direct individual network ports, no matter where on the enterprise network they physically exist, to join together as a logical switch, presenting itself like a local switch to the servers and devices. And because it is driven by software, all of the network designs and configurations can be changed instantly and dynamically, with or without human intervention. These capabilities can make your unified communications applications appear as if they are running side by side on the same local network. SDN can make it happen, even if the individual applications are in separate server racks or even in disparate data centers. Move the applications around in the virtual environment and an SDN will track those movements and automatically adjust the network to maintain the logical network, both on-premises and, in theory, across cloud-based infrastructure. Better yet, more sophisticated applications could actually request changes to the network on the fly. A call controller, for example, could programmatically adjust the QoS settings on all ports, or shut down access to other applications, to ensure that an emergency call goes through.

However, theory and practical implementation are two separate things. The Open Networking Foundation (ONF) is leading the charge to standardize how network equipment and controller applications communicate and interact with each other, based on the OpenFlow protocol. The ONF has a number of networking vendors in the fold, and they are working to deliver OpenFlow to the market. However, some of the biggest networking vendors have accepted the premise of SDN but have not yet embraced the specific OpenFlow standards. Instead they are finding their own ways to deliver on the promise of software-defined networking.
In other words, the networking market is responding to SDN, at least initially, as it has to any other disruptive shift in networking technology: proprietary if they can, standards if they must. The software-defined networking concept is a dynamic, ever-malleable enterprise network. It is ultimately the shape of things to come and an ideal response to the challenges that enterprises face in the age of server virtualization and the convergence of data, unified communications, and storage on a single IP network. Network administrators and their command-line configurations simply cannot keep up with this dynamic environment, and their networks will have to become as virtual and automated as the applications running on top of them. It remains to be seen whether OpenFlow, proprietary approaches, or something else brings us to this panacea, but one thing is clear: everyone in the networking market knows it has to get there. The network still matters, and it is time for it to catch up with the technologies developing around it.
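The control-plane/data-plane split described above can be sketched in a few lines of code. This is a toy model, not OpenFlow (real OpenFlow messages, match fields, and actions are far richer); all class and rule names here are illustrative assumptions. It shows the core idea: switches only forward according to rules, while one external controller makes decisions for all of them, including the kind of on-the-fly QoS change a call controller might request.

```python
class Switch:
    """Data plane: forwards packets using whatever rules it has been given."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}              # match -> (action, priority)

    def install(self, match, action, priority=0):
        self.flow_table[match] = (action, priority)

class Controller:
    """Control plane: one supervisor for every switch in the network."""
    def __init__(self):
        self.switches = {}

    def register(self, switch):
        self.switches[switch.name] = switch

    def push_flow(self, match, action, priority=0):
        # One decision, propagated to every registered switch at once,
        # no matter which rack or data center the switch sits in.
        for sw in self.switches.values():
            sw.install(match, action, priority)

# Two switches that could live in separate racks or data centers.
ctl = Controller()
for name in ("rack1-sw", "rack2-sw"):
    ctl.register(Switch(name))

# Normal rule for SIP signaling traffic...
ctl.push_flow(match="dst_port:5060", action="forward", priority=10)
# ...then an application (say, a call controller) requests elevated
# priority so an emergency call always gets through.
ctl.push_flow(match="emergency-call", action="forward", priority=100)
```

In a real OpenFlow deployment the `install` step would be a controller-to-switch flow-mod message handled by the agent in the switch firmware, but the shape of the relationship is the same: the switch holds state, the controller holds the intelligence.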
by Holly Lyke Ho Gland 16 May 2012
The Growth Team Membership™ (GTM) program recently surveyed marketing executives to identify their principal challenges for 2012. The survey found that marketers struggle to (1) cultivate a differentiated value proposition that resonates with clients, and (2) ensure Sales adopts the appropriate messaging and materials. Survey respondents indicate that understaffing and a lack of strategic alignment across Sales and Marketing's leadership are the primary causes of Marketing's struggles. By joining forces with Sales, Marketing can address the strategic alignment issue and tap into additional staff. However, successful collaboration requires taking a closer look at the following three areas:

1. Goal Alignment: Since revitalizing the value proposition requires a substantial commitment of time and resources, it is essential that Marketing and Sales agree on the reason for the revitalization at the outset. To achieve this goal, marketing and sales executives need to build consensus on answers to the following questions: Are our assumptions about how we are perceived by our customers accurate? Can we quantitatively prove any of our assumptions? How does our value proposition differentiate us from the competition? Does our current messaging tell the story we want? How consistently is our messaging being used?

2. Sales Involvement: No matter how necessary, or how compelling, the redesigned value proposition may be, the sales force may still resist it. Successful marketers understand that sales reps want some measure of control over the way they communicate with their customers and are prone to resenting outside influence. Marketers therefore involve the sales force throughout each stage of the new value proposition's development (including messaging creation). This inclusion builds cross-functional ownership of the new messaging and limits the likelihood that Sales will reject it later on. It also speeds new-messaging roll-out, since the sales force will already be familiar with the value proposition and how to tailor its message for various segments.

3. Continuous Engagement: Trust and transparency are crucial to any long-term successful collaboration between Sales and Marketing. One way to maintain this openness is through regularly scheduled meetings between senior management in Sales and Marketing. Growth Team Membership researchers have found that a monthly cadence works best for keeping the conversation flowing, collecting feedback, and addressing collaboration challenges. Monthly meetings also allow marketers to track value proposition adoption and identify opportunities for improvement.

While sales and marketing collaboration is a perennial challenge, some companies have found ways to unite these often-at-odds functions. Take the experience of Kronos, a workforce management software solutions company. For many years, Kronos considered itself a market leader in spite of its flat product revenue growth. This disconnect spurred Marketing and Sales to collaboratively revise and differentiate Kronos' value proposition and messaging. Kronos' sales and marketing teams followed the practices outlined above (alignment, inclusion, and engagement) to overcome key barriers to collaboration. Successful collaboration has resulted in 92% of the sales force consistently using the new messaging. Kronos' new value proposition has also led to a 36% increase in its earnings before interest, taxes, and amortization (EBITA).

In conclusion, Growth Team Membership survey data suggest that marketers are committed to differentiating their companies through redesigned value propositions. However, Marketing's efforts are constrained by a lack of coordination and buy-in from Sales. By including Sales in the development process, Marketing can ensure its efforts are adopted and strengthen its relationship with Sales for long-term success.
Want to learn more about best practices for sales and marketing integration? See the Frost & Sullivan presentation Sales & Marketing: Revitalizing the Value Proposition.