Software is eating the world, and it’s a no-brainer that GE decided to take the plunge into software development. Late in 2011, GE announced that it would invest $1B in software over three years as it seeks to use software to make its products more profitable. One could argue that this was a late move for GE; after all, IBM recognized the power and importance of software in the nineties and has since invested heavily in software and services while divesting some of its hardware assets.
On the heels of its software initiative, GE launched its Industrial Internet initiative: the convergence of the global industrial system with advanced computing, analytics, low-cost sensing and connectivity via the Internet. To make this initiative successful, GE needs key enabling technologies such as cloud, big data and analytics to come together so that it can make sense of the vast amounts of data its machines will create. And on that note, earlier this week GE announced that it would invest $105 million for a 10% stake in EMC and VMware’s newest venture – Pivotal.
Pivotal inherited assets and people from VMware and EMC with a mission to build a new platform for a new era. The platform will consist of Cloud Foundry for its Cloud Fabric, Spring and vFabric for its Application Fabric, and Pivotal HD, GemFire and GPDB for its Data Fabric. While these assets may be individually successful – Pivotal is expected to bring in $300 million in revenue by the end of 2013 – integration remains a key challenge and will take Pivotal a long time to address. Still, GE has faith in the management’s ability to execute.
Will Pivotal enable GE to compete against other companies in the IoT space? Will the Pivotal One platform enable GE to build the services it needs for its Industrial Internet initiative? Netflix didn’t choose Cloud Foundry and other Pivotal assets because they didn’t meet its needs, and instead built its own platform. Unfortunately for GE, IBM has a head start, having launched its Smarter Planet initiative in 2008 and shown 25% YoY growth in the most recent quarter. So rather than building its own platform, GE made the right move to focus on its core competencies and partner with someone who can build the platform for it. It certainly used a rather unorthodox model of partnership, taking a sizable investment stake in Pivotal rather than simply buying the product, which gives it a strong ability to influence the product’s direction. Perhaps GE’s investment will pay off…
This post was originally posted on cloudfieldnotes, a Rishidot site
In his April 2013 letter to shareholders, Jeff Bezos talks about how in 2012 AWS announced 159 new features and services. No other cloud service provider even comes close. There are several things that make Amazon the undisputed leader in cloud services, but the biggest driver behind Amazon’s competitive advantage is its customer-centric, rather than competition-centric, culture. Bezos talks about how their customer focus enables them to invent, improve, lower prices and increase value for customers before they have to, thereby earning customers’ trust. It is a letter worth reading.
We all know at the back of our minds that culture is important, yet often we focus on things that don’t really matter. Culture comes from people, not size. Yet startups often turn away candidates who have worked at larger companies, forgetting that Amazon employs 88,400 people and is more agile and innovative than many startups. Culture comes from people, not deep pockets. Yet free meals, drinks and expensive social outings are often advertised as reasons to work for a company, forgetting that people may not work well together even if you organize a team outing on a beach in Hawaii.
Building the right culture is hard work. Get it right and you attract talent. Get it wrong and people perceive working for your company as career suicide, and you end up fighting a talent war. John Willis recently gave a talk at the Silicon Valley DevOps meetup group and highlighted three very interesting cultures – Netflix, Github and Etsy. His research can be found here. While I believe no other company’s culture can be a match for your own, each of these companies provides examples of techniques that can be adopted for positive impact. For example, Etsy’s blameless post-mortem technique is very relevant in this cloud era. Failures happen, and what better way to deal with them than to view them as learning opportunities rather than punishing the individuals closest to the failures. Another interesting technique that both Netflix and Github talk about is using minimal processes, which is also relevant in today’s fast-paced cloud era, as businesses need to be agile or be left behind. Github focuses on automation to reduce human processes, while Netflix focuses on hiring high-performance people who use self-discipline to avoid chaos.
Lou Gerstner, former chairman of the board and chief executive officer of IBM, once said, “Culture isn’t just one aspect of the game, it is the game. In the end, an organization is nothing more than the collective capacity of its people to create value.” So it comes down to who’s got the right people, and that’s the horse you want to bet on.
Take a look at crowdsourcing platform Kickstarter’s top 5 funded projects in technology to find the things that interest the crowd. Not surprisingly, Internet of Things / home automation is one of the popular categories. There is pent-up demand: people have long wanted more connected things in their homes, but there haven’t been many products that are affordable or easy to use. Now we have the key enabling factors for supply to finally meet demand. We have cheap sensors, readily available networks and smartphones. We have cloud services that allow us to add intelligence to everyday things in our world without needing processing and data storage to be part of the things themselves. These cloud services form the basis for communication, data storage and analytics. And we don’t need to be millionaires to take advantage of these solutions.
Earlier this year I wrote a post about key attributes an IoT solution must have. SmartThings, one of the top 5 projects on Kickstarter, really gets them. The company emerged from a need to get notified about temperature drops and flooding, and has since developed solutions for monitoring, controlling and automating the home. What is really interesting is the open platform SmartThings has built, whether it is supporting the wireless standard of your choice or a variety of third-party devices. They also allow developers to write apps that work inside the main mobile application because, after all, who wants a separate mobile app per use case or device? And to attract developers, they run a developer contest with substantial prize money, something Netflix is doing today too with its Netflix Open Source software. An excellent way to quickly expand your solution, as time is money and good developers are hard to come by these days.
The real power of connected devices is in the apps that link objects together – for example, linking weather information from the cloud to your lights so that they turn on when the sun goes down. SmartThings lets you build such rules into its rule engine using data from, say, the Weather Underground cloud, while Belkin WeMo can do this using the IFTTT service. In the future, things will be able to learn our behavior and proactively automate tasks for us. For example, if the first thing you do after waking up is turn on your coffee pot, perhaps the coffee pot could ask whether you want a rule so that when you wake up, it automatically starts making your coffee while you brush your teeth. No magic needed, just intelligent things connected to each other. The possibilities are endless.
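As a rough sketch of what such a trigger-action rule might look like under the hood (the rule names and event fields here are purely illustrative, not SmartThings' or IFTTT's actual APIs):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """An IFTTT-style rule: when the trigger matches an event, run the action."""
    name: str
    trigger: Callable[[dict], bool]  # inspects the latest event, e.g. a weather update
    action: Callable[[], None]       # side effect, e.g. send a command to the lights

def run_rules(rules, event):
    """Evaluate every rule against an incoming event; return the names that fired."""
    fired = []
    for rule in rules:
        if rule.trigger(event):
            rule.action()
            fired.append(rule.name)
    return fired

# Hypothetical example: turn the porch light on when the sun goes down.
lights_on = []
rules = [Rule(
    name="lights-at-sunset",
    trigger=lambda event: event.get("sun") == "down",
    action=lambda: lights_on.append("porch"),
)]

print(run_rules(rules, {"sun": "down"}))  # → ['lights-at-sunset']
```

The point of the pattern is that triggers and actions come from different devices and cloud services, yet compose into one rule.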
You know that Internet of Things everyone is talking about these days? Kevin Ashton is the one who started it all. I had a chance to catch up with Kevin earlier this month, and we discussed why IoT is taking off now, the trends, the challenges and what excites him about IoT.
To begin with, Kevin talked about how IoT really means things that aren’t computers or smartphones being connected to the network and communicating: your home communicating its temperature, your car tire communicating that it needs air, a warehouse communicating its inventory, or a health monitoring system connected to a person communicating data to a caregiver. Kevin pointed out that the big change we are seeing at the moment is the attention IoT is getting rather than its adoption. However, the stage has been set for IoT adoption to happen much more quickly now than in 1999, when Kevin first started talking about it. The electronics we need for sensing have become cheaper and need less power than before. We have readily available WiFi networks to transmit data from sensors to the cloud, as consumer WiFi adoption has grown rapidly in the last 5 years. Cloud services themselves cost less and are more advanced. And we now have very high-powered smartphones that let us not only interface wirelessly with sensors but also connect to the cloud to view and interact with the data. We have also become more sophisticated at machine learning technologies that help us recognize patterns in big data. Another enabling trend is that it has become easier than ever for smart people to make things in their garage. Kickstarter exists because there are very smart people with very good ideas who can execute at least to the prototype stage with very little money. All of these things combined create an environment where you can start thinking about creating low-cost distributed sensor networks. So while the spotlight may move away, the work will continue, and we are on the verge of something very special. And startups are the ones that will lead the way. Every technology revolution creates a few new huge companies and disrupts old huge companies.
The revolution is driven by some people who have the ability to understand things and persuade other people to make changes. So it comes down to which company has the talent and that’s where exciting things really start to happen.
Who can benefit from IoT? Any industry that has anything physical can benefit from it, which is pretty much every industry. Kevin asserts that this is a revolution that will either benefit you or, if you ignore it, benefit your competition and then hurt you. One may argue that abstract industries like finance may not benefit from IoT, but even the finance industry has equipment and people. However, the industries most likely to adopt it first are the ones with the most physical assets, where information about those assets is hard to come by right now, e.g. manufacturing, retail, health care, transportation, building management, construction, energy, water and natural gas.
A trend Kevin sees in this space is the move towards low-powered wireless communications. A not-so-popular belief he holds, based on his experience with the WeMo line of products he manages, is that WiFi – not Zigbee or Z-Wave – will be the network of choice, primarily because of its ubiquity. Technologies that improve battery life, or that harvest power from another source such as vibrations, are another area of research. But beyond technology, he sees a trend towards modularity, open APIs and open interfaces, where everything works with everything else and users are not locked into anything. Apart from avoiding lock-in, ad hoc modularity is the only way we are going to build a system with the scale needed for IoT.
An area Kevin finds exciting is connected LED lighting. Every light bulb will be a networked LED light bulb, and you will be able to control your lighting from something other than a switch on the wall. You could control lighting through your phone, turn lights on and off without getting out of bed, or have the lights in your home turn on as you near home from work. Since a light bulb is now networked, it can also communicate information about the home and form a stepping-stone for a low-powered network that can sense things like temperature, ambient light and motion. A light bulb is certainly a good place to start building a distributed sensor network! Energy and water consumption are other areas of interest to Kevin. Homes in California are streaming information about their water and electricity consumption in real time – and not just how much water or electricity they are consuming. These homes track fine-grained consumption, e.g. how much water is used in the shower or how much electricity is used by the refrigerator. Leaks can be identified, and devices using a lot of power can be turned off when not in use. This kind of tracking can not only help people use less of these precious resources but can also help us learn more about human behavior. In fact, Alex Laskey of Opower showed that the psychology of saving energy is driven by social pressure rather than cost savings or the need to do the right thing.
IoT does face some challenges. Kevin believes that some standardization needs to happen, but we must be careful about relying too heavily on new standards emerging. The new generation of engineers already has all the standards it needs – TCP/IP, 802.11, REST, SOAP, XML etc. The area where we need the most innovation right now is the power budget for devices, because things are neither going to be plugged in nor can we expect people to change batteries often. Another area where we are facing a challenge right now is the skills shortage in machine learning. Data has no value unless an automated system can extract the information hiding in it, make connections that are meaningful, and share them with another automated system or a human being. That kind of analytics is not trivial. So if you have kids studying computer science, encourage them to explore machine learning and analytics.
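As a toy illustration of the kind of automated pattern-finding that passage describes, here is a minimal statistical anomaly detector over a stream of sensor readings (entirely illustrative; real IoT analytics pipelines are far more sophisticated than a z-score test):

```python
import statistics

def find_anomalies(readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the mean."""
    if len(readings) < 2:
        return []
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # a perfectly flat signal has no outliers
    return [x for x in readings if abs(x - mean) / stdev > threshold]

# A mostly steady temperature stream with one spike the system should notice.
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 35.0, 21.1, 20.8]
print(find_anomalies(temps, threshold=2.0))  # → [35.0]
```

A system like this only becomes useful when the flagged reading is routed onward automatically, to another system or to a person, which is exactly the skills gap Kevin highlights.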
At its annual Directions conference on March 5th, IDC talked about the shift the ICT industry is making to the 3rd platform, the ecosystem driven by mobile, social, cloud and big data. The shift is occurring from the LAN/Internet, client-server and PC era – the 2nd platform – which was preceded by the mainframe-terminal era, the 1st platform. While the shift itself is not surprising, the interesting fact is that the 3rd platform is where 90% of the growth opportunity lies over the next 7 years. But buyers are looking for solutions rather than pure technology, so it becomes important for vendors to expand the value they bring to the table from silos to mashups. It’s not just about cloud or mobile or big data or social anymore. It’s about combining these into something meaningful that improves the customer experience. To build the right experience, vendors should design their products for consumers first and then enhance them for the enterprise, rather than following an enterprise-first/only policy, and remember that mobility goes beyond smartphones and tablets, as our lives fill with connected everyday devices ranging from cameras to cars to toothbrushes.
IDC believes that applications for the 3rd platform are going to be developed on, and will live in, PaaS solutions. This view does not match what Forrester’s Q3 2012 Global Cloud Developer Survey found, which is that 71% of cloud developers use IaaS – specifically Amazon – to deliver applications. Still, IDC has an interesting view, one I tend to agree with: the next generation of PaaS solutions will be industry-focused vertical platforms such as FinQCloud, Euronext, BaseSpace and Panoptix. These next-generation vertical platforms will become powerhouses of valuable data, and CIOs are going to gravitate towards that data. Thus, it becomes increasingly important for vendors to understand Data Gravity, a concept first described by Dave McCrory.
The 3rd platform is powerful because it enables a new buyer of IT products: the LOB executive. This LOB executive is no longer dependent on internal IT to deliver what is needed. He or she is looking for offerings that can be easily consumed via a subscription model. Ultimately, IDC recommends that we prepare for the death of dedicated IT and embrace shared models to be successful. But what traditional IT organizations really need to do is evolve into service brokers in order to support these new decision makers, and worry less about in-house vs off-premise, as business needs will determine where IT lives.
Think Amazon and you think disruptive innovation. Amazon introduced a simple infrastructure service at low cost that was initially attractive to SMBs and now to, well, everyone. It is the market share leader in the IaaS space, and no one even comes close. And it hasn’t stood still, introducing new services while regularly lowering prices. In fact, it continues to innovate disruptively beyond pure infrastructure. Think AWS Redshift.
Flipping the coin and looking at VMware, we see another market share leader, one that dominates the server virtualization space. But the difference is that, compared to Amazon, VMware is standing still. In fact, I think it took a step backwards by spinning off its big data and cloud application platform assets and deciding to focus on its core server virtualization business. Some may disagree. Yes, its core business is doing well, but it is also facing increased competition from Microsoft Hyper-V and the open source rivals Xen and KVM. Over 60% of workloads are already virtualized. License revenue growth declined from 31% YoY to 13% YoY in 2012, and the cloud management software business brings in less than 20% of total license revenue. The stock price has suffered and is down 24% since the beginning of the year. Perhaps the pressure is getting to VMware executives, who lashed out at Amazon, warning partners that if they let workloads go to Amazon, they lose. The comments that CEO Pat Gelsinger and President and COO Carl Eschenbach made at the VMware Partner Exchange conference are shocking to say the least, and there has been much discussion about them. Forrester’s James Staten wrote an excellent post on it. What is even more shocking is that when Carl Eschenbach referred to Amazon as just a company that sells books, he drew applause from partners. These partners also think that Amazon is just for test and development environments. A classic case of groupthink, or as James Staten put it, the blind leading the blind? They need to wake up and realize that AWS revenue is estimated at 82% of VMware’s total revenue in 2012. Amazon is neither just a bookseller, nor is AWS used just for dev/test environments. There is a reason AWS outages are so highly publicized and scrutinized – several companies depend on it for critical operations.
Richard Saintvilus on The Motley Fool suggested that Oracle should come to VMware’s rescue. I don’t know about you, but the thought of buying software from VMwaracle makes me picture a jail cell. VMware needs to rethink its strategy of “owning the corporate workload now and forever” and understand what customers really want.
At Netflix, even developer meetups have a movie-like experience… The first Netflix OSS meetup was held on Wednesday at the Netflix HQ, in their theatre room, with popcorn and sodas! The energy in the room was fantastic as people came in to learn more about the projects Netflix has open sourced, starting with the Curator project in 2011, and its 2013 plans.
I must say that Netflix is among the few companies that have the right idea: its platform is the business enabler, while its content is its competitive advantage. By open sourcing its platform and inviting others to contribute, it can focus its big investment dollars on Netflix-unique services while leveraging industry experience and best practices to keep moving the platform forward. In fact, Netflix has done such a great job with developer relations that developers keep contributing to the open source projects even after they leave the company. While there weren’t many in the room contributing to the projects, Netflix is hoping that picture changes over time.
Such is the interest in Netflix tools that within 24 hours of announcing the next meetup, over 150 people had already signed up. However, not many people are going to be able to use Netflix OSS to develop cloud applications end to end. As it stands today, Netflix OSS is a collection of components. Companies would need engineering talent to put the pieces together, and not many have it. Even if they could put the pieces together, the Achilles’ heel of the Netflix platform is its reliance on Amazon. Netflix has been burnt by Amazon, with its most recent Christmas Eve outage being very visible.
Netflix clearly understands these issues. It wants to make its platform easy to adopt and work towards building a platform ecosystem. It also wants to eliminate AWS as its single point of failure and add portability and availability to the platform. Its 2013 roadmap highlights build and deploy, recipes (sample applications), availability, analytics and persistence as key categories. The Netflix OSS overview and roadmap can be found here and the lightning talks about the various projects can be found here. It will be interesting to see how the platform evolves and whether scalable, feature rich public cloud alternatives to AWS emerge to make the platform truly portable.
In 1999, Kevin Ashton coined the term “The Internet of Things” as the title of a presentation he made at Procter & Gamble, where he linked the idea of RFID to the Internet. “If we had computers that knew everything there was to know about things—using data they gathered without any help from us—we would be able to track and count everything, and greatly reduce waste, loss and cost. We would know when things needed replacing, repairing or recalling, and whether they were fresh or past their best. The Internet of Things has the potential to change the world, just as the Internet did. Maybe even more so”. Come 2013, several analysts have predicted that this will be the year the Internet of Things (IoT) finally takes off. Sensors are cheap, cloud makes compute and storage available on demand, and big-data analytics solutions are maturing. But for IoT to be successful, the following things need to happen…
The things connected to the internet need to provide value
Just because it is possible to connect a device to the Internet, should we? Does my rice cooker need to be networked? Not really, and besides, I certainly wouldn’t pay $600 just to send recipes to it or know how much electricity it has used. The things that are part of IoT need to provide a valuable service at a price point that enables adoption, or be part of a larger system that does.
A rich ecosystem needs to emerge
IoT comprises the things themselves, sensors, communication systems, servers, storage, analytics and end-user services. Developers, network operators, hardware manufacturers and software providers need to come together to make it work. The right partnerships need to be developed so that the right functionality is easily available to customers in the marketplace.
Systems need to provide APIs
The beauty of APIs is that they allow users to take advantage of systems suited to their needs on devices of their choice. They also allow developers to innovate and create something new and interesting by leveraging the system’s data and services, ultimately driving the system’s usage and adoption.
Developers need to be attracted
Now that everything is going to be a development platform, we need developers who can program against these platforms. Developers are going to be in huge demand, but they are also going to need a lot of different tools to develop solutions that work across different device platforms. And vendors need to find a way to attract developers to their platforms. Why would a developer target the Ford API vs the GM API if they own neither car?
Security needs to be built in
Things previously cut off from the digital world will be newly connected to it, exposing them to new attacks and challenges. Things from pacemakers to insulin pumps to home automation systems have been shown to be vulnerable to attacks. As we rush to connect everything to the Internet, we are exposing ourselves to financial attacks, geo-political attacks, or even attacks just for fun. Why not design systems with security built in from the ground up rather than worry about it later?
Gartner predicts that by 2020 over 30 billion devices will be connected to the internet, and ranks IoT among the top 10 strategic technology trends for 2013. Yet a ZDNet survey rates an immature market as the top reason businesses are not using the technology. IoT does have the potential to generate operational benefits and new revenue, and is already being used in several industries such as health care, IT, energy, transportation, facility management, manufacturing, retail and consumer. But will IoT become the Internet of Everything? Time will tell.
Some say the networking industry is going through its biggest transformation yet, energized by software defined networking. A new crop of startups has emerged, and existing networking vendors are rolling out SDN-enabled products or snapping up startups lest they be left behind. Juniper most recently acquired a 2-day-old startup – Contrail Systems. A gold rush, really! People are picking up networking skills, as evidenced by the tens of thousands of people around the world who signed up for the free “Introduction to Networking” course offered by Stanford this fall, taught by Philip Levis and Nick McKeown. Dear server, step aside; it’s time for the network to be the sexy one in the datacenter!
Developing features in software lowers the barrier to entry compared to developing the integrated systems common in the current networking landscape. From a vendor perspective, the high margins of companies like Cisco are attracting several players to the market. The Jeff Bezos aphorism “Your margin is my opportunity” is certainly apt here. From a buyer perspective, for cloud service providers or companies that operate global data centers, e.g. Google or Verizon, SDN offers the ability to drive high utilization rates by treating geographically disparate resources as a unified pool. These companies find that, unlike compute and storage, networking spend traditionally hasn’t gone down with scale; in fact, scale requires more expensive equipment. Management spend is also higher due to the need to manually configure individual elements or to develop complex automation to deal with non-standard vendor configuration APIs. SDN enables them to potentially buy merchant silicon and pair it with networking smarts on an x86 server. Software-based smarts also allow them to introduce services faster and modify those services as needs arise, increasing business agility and allowing them to focus on their competitive advantage.
But when we take the rose-colored glasses off, we can see that SDN is still in the process of being defined. And buyers are skeptical about the need for and capabilities of SDN, as noted by Lori MacVittie, who recently attended the Gartner DC conference.
Of course, the hesitation is also partly due to resistance to change; after all, there are thousands of professionals specializing in the current state of network administration. Still, at the moment SDN is not everyone’s cup of tea. Large companies that have the staff to dedicate to implementing SDN-based technologies can be early adopters, since vendors are unlikely to make buyers’ lives easy anytime soon… they are unlikely to completely open up control interfaces or standardize on an API that can be used to manage network devices in a multi-vendor data center. There is also a clear need for education and clarity in the market. One of my favorite quotes to express this was stated recently by Doug Gourlay, vice president of marketing, service provider and federal sales at Arista Networks: “Some people have good strategies and others are really good at PowerPoint.”
The $1.26B acquisition of Nicira kicked off a gold rush in the networking industry, where existing networking vendors are flocking to the market to buy startups so that they can sell SDN-based products. There will be some innovation and a lot of “SDN washing”; after all, a transition from a hardware-driven business to a software-driven business does not happen overnight… but this is the beginning. Exciting times lie ahead for networking, the datacenter and the cloud.
The morning after Thanksgiving, the family gathered around the breakfast table with our coffee cups in one hand and our tablets or smartphones in the other. We got to talking about how these devices have changed our lives. While we are distracted by them, mobile computing has also raised our productivity. The conversation lent itself to the inevitable question: what’s next? We talked about how connected devices, with intelligent machine-to-machine communication powered by advanced analytics and social media, will be commonplace in our future. But what impact will that have on our lives and the economy as a whole?
Before we dive into that, let’s look at what has come before and what its impact has been. If we look at worldwide GDP, sustained growth is a modern phenomenon.
Sustained growth didn’t occur until the 1800s and was triggered by the industrial revolution. Digging deeper into growth, production can be explained by the Cobb-Douglas equation:

y = A · k^0.3 · n^0.7

where:

y = total production (the monetary value of all goods produced in a year)
k = capital input (the monetary worth of all machinery, equipment, and buildings)
n = labor input (the total number of person-hours worked in a year)
A = total factor productivity (TFP)

Thus growth can be explained by the following equation:

Δy/y = ΔA/A + 0.3 · Δk/k + 0.7 · Δn/n
The above equation shows that both capital and labor have diminishing marginal productivity, and that without TFP growth, sustained growth is not possible. Higher TFP means the economy is more efficient, producing more output with the same labor and capital inputs. In this post-industrial society, is there anything that can trigger growth?
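To make the growth-accounting decomposition concrete, here is a small worked example under the Cobb-Douglas assumption above (the growth rates are made up purely for illustration):

```python
def output_growth(tfp_growth, capital_growth, labor_growth,
                  capital_share=0.3, labor_share=0.7):
    """Growth accounting: Δy/y = ΔA/A + 0.3·Δk/k + 0.7·Δn/n."""
    return tfp_growth + capital_share * capital_growth + labor_share * labor_growth

# Suppose TFP grows 1%, capital 4%, and labor 1% in a year.
g = output_growth(0.01, 0.04, 0.01)
print(f"{g:.1%}")  # → 2.9%
```

Note how little of the 2.9% comes from labor: with diminishing returns on capital and labor, the TFP term is what sustains long-run growth.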
Technologies that enable productivity and efficiency gains fuel economic growth. Connected devices, with intelligent machine-to-machine communication powered by advanced analytics and social media, can be growth enablers. If we connect devices with sensors, enable them to communicate with each other, and leverage advanced analytics software, predictive algorithms, automation and social media, we can achieve very interesting results.
Sensors connected to a system can transmit its health to analytics software in the cloud, which leverages predictive analytics to detect impending system failure and schedule maintenance, all without human intervention, thus increasing efficiency. Imagine a self-healing data center. The data center senses that its cooling system performance has degraded and that within two hours the system will no longer be able to maintain acceptable temperatures. The data center kicks off its disaster recovery program and activates the backup data center. It also identifies the problem within the cooling system and schedules maintenance. The data center then uses social media to alert other systems that subscribe to its updates about the failure and the change in primary data center. These systems can leverage this information to determine whether they need to run health checks, identify faults beforehand and correct them. On a much smaller scale, imagine that you are on vacation and your water heater breaks down. Your home detects the failure and alerts you on your phone, but along with the failure report, it accesses social media to identify recommended replacements, where you can order them, and how you can schedule maintenance.
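The self-healing loop described above can be sketched as a simple monitor-predict-act cycle. This is a deliberately naive linear extrapolation, and the threshold and alert names are purely hypothetical:

```python
FAILURE_THRESHOLD = 60.0  # below this cooling efficiency (%), temperatures can't be held

def hours_until_failure(readings):
    """Naively extrapolate a degrading cooling metric to predict failure.
    readings: cooling efficiency (%) sampled once per hour, most recent last."""
    if len(readings) < 2:
        return None
    rate = readings[-1] - readings[-2]  # change per hour
    if rate >= 0:
        return None  # not degrading, nothing to predict
    return (readings[-1] - FAILURE_THRESHOLD) / -rate

def monitor(readings, alerts):
    """If predicted failure is within 2 hours, fail over and schedule maintenance."""
    eta = hours_until_failure(readings)
    if eta is not None and eta <= 2.0:
        alerts.append("failover-to-backup")
        alerts.append("schedule-maintenance")
    return eta

alerts = []
eta = monitor([80.0, 70.0], alerts)  # dropping 10%/hour, 1 hour from threshold
print(eta, alerts)  # → 1.0 ['failover-to-backup', 'schedule-maintenance']
```

In the scenario above, the `alerts` list would feed a publish-subscribe channel so that downstream systems learn of the failover and can run their own health checks.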
Systems become self-healing and self-learning, with greater efficiencies than before. Cloud, big data, analytics, mobile and social become key enabling technologies. IBM with its Smarter Planet initiative and GE with its Industrial Internet initiative are doing some very cool things in this space. What about you? Do you have an interesting project you are working on? I would love to hear about it.