Accenture Acquires Cloud Sherpas To Enhance Cloud Consulting Chops, Especially Around Salesforce.com
Accenture, the worldwide consulting company, announced today it was buying Cloud Sherpas, a firm that specializes in helping companies incorporate cloud services like Salesforce.com, Google and ServiceNow into the enterprise.
Accenture did not disclose the purchase price.
It’s not a coincidence that this announcement came as Dreamforce, the enormous Salesforce.com customer conference, is taking place in San Francisco this week. The acquisition actually has a lot to do with Salesforce consulting services.
Accenture already has a team of more than 2,700 consultants devoted to helping with Salesforce integration; the Cloud Sherpas deal adds 500 more Salesforce experts.
“The context is Accenture has been driving a cloud first agenda in response to clients focusing on the cloud increasingly as a platform to fuel transformation,” Saideep Raj, global managing director at Accenture told TechCrunch. The deal certainly helps expand the Salesforce consulting team, but it’s more than that, Raj said.
Cloud Sherpas also brings experience with Google as Google’s largest consulting partner and ServiceNow, a company that Accenture is seeing embedded in an increasing number of enterprise processes where service is a key component.
Cloud Sherpas has been around since 2007 and has grown into a worldwide consultancy with over 1,100 employees, who will become part of Accenture as soon as the deal closes. They will join the newly created Accenture Cloud First Applications team.
If cloud computing is supposed to simplify computing, you may wonder why it would require a consulting team to help implement cloud solutions, and that’s a legitimate question. Companies moving to the cloud have lots of issues around digitization and transformation, and working with existing legacy applications alongside cloud applications. There are also issues of more complex custom integrations with a product as sophisticated as Salesforce.
That’s where Cloud Sherpas comes into play. While Accenture has also been helping companies make this move to the cloud, even before the acquisition, this gives them a huge team of experienced consultants to expand that consulting unit with one stroke of a pen on a check.
Cloud Sherpas formed at a time when the idea of Software as a Service in the enterprise was just beginning to develop as a mainstream concept. While Salesforce.com launched in 1999, many of today’s biggest cloud companies weren’t even around at that point. Cloud Sherpas was well ahead of the market in that regard.
“We saw several things including demand from users of technology, not just as it relates to corporate transactions, but enabling the user experiences around mobile technology. The cloud was uniquely suited to this and we thought it was going to take off and resonate with users,” David Northington, Cloud Sherpas CEO said.
Over the years, cloud services have gotten better and the projects have grown increasingly sophisticated. Northington says that as part of Accenture, that trend should only accelerate.
Today there is plenty of complexity in spite of the cloud notion of simplicity. Integrating an enterprise cloud service into an existing enterprise stack sometimes takes help. For instance, companies working with Salesforce Wave, the company’s analytics platform, might need help connecting to various data sources and creating the kinds of custom reports they need.
But it’s about more than helping implement a single service; it’s about stitching together a range of services from a single vendor like Salesforce, or across vendors, and that’s where this combination could really shine, Raj explained.
Cloud Sherpas has raised over $63 million, according to Crunchbase.
Titan Graph Database Integration with DynamoDB: World-class Performance, Availability, and Scale for New Workloads
Today, we are releasing a plugin that allows customers to use the Titan graph engine with Amazon DynamoDB as the backend storage layer. It lets customers enjoy the value that graph databases bring to relationship-centric use cases without worrying about managing the underlying storage.
The importance of relationships
Relationships are a fundamental aspect of both the physical and virtual worlds. Modern applications need to quickly navigate connections in the physical world of people, cities, and public transit stations as well as the virtual world of search terms, social posts, and genetic code, for example. Developers need efficient methods to store, traverse, and query these relationships. Social media apps navigate relationships between friends, photos, videos, pages, and followers. In supply chain management, connections between airports, warehouses, and retail aisles are critical for cost and time optimization. Similarly, relationships are essential in many other use cases such as financial modeling, risk analysis, genome research, search, gaming, and others. Traditionally, these connections have been stored in relational databases, with each object type requiring its own table. When using relational databases, traversing relationships requires expensive table JOIN operations, causing significantly increased latency as table size and query complexity grow.
Enter graph databases
Graph databases belong to the NoSQL family, and are optimized for storing and traversing relationships. A graph consists of vertices, edges, and associated properties. Each vertex contains a list of properties and edges, which represent the relationships to other vertices. This structure is optimized for fast relationship query and traversal, without requiring expensive table JOIN operations.
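The vertex/edge/property model described above can be sketched in a few lines of plain Python. This is only an illustration of the data structure and why traversal avoids JOINs; the names and layout are mine, not Titan’s API.

```python
# Minimal property-graph sketch: each vertex carries its properties plus
# outgoing edges, so traversal is pointer-chasing rather than a table JOIN.
# All names here are illustrative; this is not Titan's actual API.

graph = {
    "alice": {"props": {"city": "Boston"},  "edges": {"friend": ["bob"]}},
    "bob":   {"props": {"city": "Seattle"}, "edges": {"friend": ["carol"]}},
    "carol": {"props": {"city": "Austin"},  "edges": {"friend": []}},
}

def traverse(graph, start, label, depth):
    """Collect vertices reachable from `start` via `label` edges within `depth` hops."""
    frontier, seen = {start}, set()
    for _ in range(depth):
        frontier = {
            neighbor
            for v in frontier
            for neighbor in graph[v]["edges"].get(label, [])
        } - seen
        seen |= frontier
    return seen

# Friends-of-friends for alice: bob (1 hop) and carol (2 hops).
print(sorted(traverse(graph, "alice", "friend", 2)))  # ['bob', 'carol']
```

Note that the cost of the traversal depends only on the edges actually visited, not on the total size of the graph, which is the consistent-latency property the paragraph above describes.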
In this way, graphs can scale to billions of vertices and edges, while allowing efficient queries and traversal of any subset of the graph with consistent low latency that doesn’t grow proportionally to the overall graph size. This is an important benefit for many use cases that involve accessing and traversing small subsets of a large graph. A concrete example is generating a product recommendation based on purchase interests of a user’s friends, where the relevant social connections are a small subset of the total network. Another example is for tracking inventory in a vast logistics system, where only a subset of its locations is relevant for a specific item. For us at Amazon, the challenge of tracking inventory at massive scale is not just theoretical, but very real.
Graph databases at Amazon
Like many AWS innovations, the desire to build a solution for a scalable graph database came from Amazon’s retail business. Amazon runs one of the largest fulfillment networks in the world, and we need to optimize our systems to quickly and accurately track the movement of vast amounts of inventory. This requires a database that can quickly traverse the logistics history for a given item or order. Graph databases are ideal for the task, since they make it easy to store and retrieve each item’s logistics history.
Our criteria for choosing the right graph engine were:
- The ability to support a graph containing billions of vertices and edges.
- The ability to scale with the accelerating pace of new items added to the catalog, and new objects and locations in the company’s expanding fulfillment network.
After evaluating different technologies, we decided to use Titan, a distributed graph database engine optimized for creating and querying large graphs. Titan has a pluggable storage architecture, using existing NoSQL databases as underlying storage for the graph data. While the Titan-based solution worked well for our needs, the team quickly found itself having to devote an increasing amount of time to provisioning, managing, and scaling the database cluster behind Titan, instead of focusing on their original task of optimizing the fulfillment inventory tracking.
Thus, the idea was born for a robust, highly available, and scalable backend solution that wouldn’t require the burden of managing a massive storage layer. As I wrote in the past, I believe DynamoDB is a natural choice for such needs, providing developers flexibility and minimal operational overhead without compromising scale, availability, durability, or performance. Making use of Titan’s flexible architecture, we created a plugin that uses DynamoDB as the storage backend for Titan. The combination of Titan with DynamoDB is now powering Amazon’s fulfillment network, with a multi-terabyte dataset.
Sharing it with you
Today, we are happy to bring the result of this effort to customers by releasing the DynamoDB Storage Backend for Titan plugin on GitHub. The plugin provides a flexible data model for each Titan backend table, allowing developers to optimize for simplicity (single-item model) or scalability (multi-item model).
The single-item model uses a single DynamoDB item to store the edges and properties of a vertex. In DynamoDB, the vertex ID is stored as the hash key of an item, vertex property and edge identifiers are attribute names, and the vertex property values and edge property values are stored in the respective attribute values. While the single-item data model is simpler, DynamoDB’s 400 KB item size limit means you should only use it for graphs with fairly low vertex degree and a small number of properties per vertex.
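As a rough illustration of the single-item layout just described, the following sketch shows one vertex flattened into a single DynamoDB item. The attribute names are hypothetical, chosen for readability; they are not the plugin’s actual wire format.

```python
# Illustrative sketch of the single-item model: one DynamoDB item per
# vertex, with the vertex ID as the hash key and every property and edge
# packed into that one item's attributes. Names are hypothetical, not
# the plugin's real attribute encoding.

vertex_item = {
    "vertex_id": "user#alice",            # hash key
    "prop:name": "Alice",                 # vertex properties as attributes
    "prop:city": "Boston",
    "edge:friend#bob": {"since": 2012},   # edges (with edge properties)
    "edge:friend#carol": {"since": 2014},
}

# The whole neighborhood of a vertex comes back from a single GetItem,
# but the 400 KB item limit caps total degree plus property size.
edges = {k: v for k, v in vertex_item.items() if k.startswith("edge:")}
print(len(edges))  # 2
```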
For graphs with higher vertex degrees, the multi-item model uses multiple DynamoDB items to store the properties and edges of a single vertex. In the multi-item data model, the vertex ID remains the DynamoDB hash key, but unlike the single-item model, each column becomes the range key of its own item, with each column value stored in its own attribute. While requiring more writes to initially load the graph, the multi-item model allows you to store large graphs without limiting vertex degree.
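The multi-item layout can be sketched the same way: the same vertex spread across several items that share the hash key, one item per column. Again, the key and attribute names here are hypothetical illustrations, not the plugin’s actual schema.

```python
# Illustrative sketch of the multi-item model: one DynamoDB item per
# property or edge ("column"), all sharing the vertex ID as hash key,
# with the column identifier as the range key. Names are hypothetical.

vertex_items = [
    {"vertex_id": "user#alice", "column": "prop:name",        "value": "Alice"},
    {"vertex_id": "user#alice", "column": "prop:city",        "value": "Boston"},
    {"vertex_id": "user#alice", "column": "edge:friend#bob",  "value": {"since": 2012}},
    {"vertex_id": "user#alice", "column": "edge:friend#carol","value": {"since": 2014}},
]

# Loading costs one write per column, but no single item can hit the
# 400 KB cap, so vertex degree is effectively unbounded; a Query on the
# hash key returns all of a vertex's columns.
degree = sum(1 for item in vertex_items if item["column"].startswith("edge:"))
print(degree)  # 2
```

The trade-off mirrors the text above: more writes at load time in exchange for unbounded vertex degree.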
Amazon’s need for a hassle-free, scalable Titan solution is not unique. Many of our customers told us they have used Titan as a scalable graph solution, but setting up and managing the underlying storage are time-consuming chores. Several of them participated in a preview program for the plugin and are excited to offload their graph storage management to AWS. Brian Sweatt, Technical Advisor at AdAgility, explained:
“At AdAgility, we store data pertaining to advertisers and publishers, as well as transactional data about customers who view and interact with our offers. The relationships between these stakeholders lend themselves naturally to a graph database, and we plan to leverage our experience with Titan and Groovy for our next-generation ad targeting platform. Amazon’s integration between Titan and DynamoDB will allow us to do that without spending time on setting up and managing the storage cluster, a no brainer for an agile, fast-growing startup.”
Another customer says that AWS makes it easier to analyze large graphs of data and relationships within the data. According to Tom Soderstrom, Chief Technology Officer at NASA’s Jet Propulsion Laboratory:
“We have begun to leverage graph databases extensively at JPL and running deep machine learning on these. The open sourced plugin for Titan over DynamoDB will help us expand our use cases to larger data sets, while enjoying the power of cloud computing in a fully managed NoSQL database. It is exciting to see AWS integrate DynamoDB with open sourced projects like Elasticsearch and Titan, while open sourcing the integrations.”
Bringing it all together
When building applications that are centered on relationships (such as social networks or master data management) or auxiliary relationship-focused use cases for existing applications (such as a recommendation engine for matching players in a game or fraud detection for a payment system), a graph database is an intuitive and effective way to achieve fast performance at scale, and should be on your database options shortlist. With this launch of the DynamoDB storage backend for Titan, you no longer need to worry about managing the storage layer for your Titan graphs, making it easy to manage even very large graphs like the ones we have here at Amazon. I am excited to hear how you are leveraging graph databases for your applications. Please share your thoughts in the comment section below.
Geoffrey Moore coined the term “systems of engagement” to describe IT systems that support multiway communication and collaboration between businesses and customers. These are distinct from “systems of record,” or those IT systems (e.g., databases and management information systems) designed primarily for one-way, read access of structured data. In today’s highly competitive global markets, digital systems of engagement are an absolute necessity for enhanced employee productivity, partnership success, customer satisfaction and brand loyalty, all of which result in revenue growth. In fact, a Deloitte Digital survey found that by the end of the year, digital interactions would influence 64 cents of every dollar spent in retail stores.
In its global survey for IBM, “Systems Of Engagement Demand New Integration Solutions — And A New IT,” Forrester Research reports how customer expectations of their business interactions are changing dramatically. What customers want most are easier interactions, the ability to deal with businesses via their smartphones, and consistent treatment across all channels. And as any successful business knows, the customer is always right.
Companies that embrace SMAC (social, mobile, analytics and cloud) will be best positioned to deliver systems of engagement that meet these users’ expectations. For example, users want to engage with retailers, banks, and any other personal or business services via any device and any channel, from mobile apps on their smartphones, to tablets, social media and cloud-based e-commerce websites. Wearables are also increasing in popularity, with the Workforce Institute at Kronos Inc. reporting that 73% of online adults see “safety and wellness” as one potential wearable-workplace benefit.
Perhaps most important, users expect customized interactions, in which big data and analytics deliver the intelligence required to enhance user experiences and drive new revenue. For example, by targeting customers with personalized offers and suggesting relevant purchases, retailers leverage valuable customer data to encourage new sales. And the multiway interaction continues after the purchase. Customers are engaged in providing retailers with product feedback and chatter on social media about product quality, which the retailer then analyzes to gain insights for encouraging return business. Add the Internet of Things (IoT), and soon organizations will be engaging with cars, appliances and other purchased items, spurring new sources of service revenue that have the potential to last for years.
From the Center to the Edge
Moving to systems of engagement requires an Interconnection Oriented Architecture™ that can deliver a satisfying, real-time experience for any user, anywhere, on any device, via any engagement channel.
In an age of digital images, voice and video, more than fat pipes are required: low latency is vital. Low latency comes from proximity, not only to the user, but also among the applications, data repositories, cloud services and other elements that drive the engagement experience.
That’s why successful enterprises are moving from a centralized interconnection architecture, with multiple long-distance MPLS, Internet virtual private networks (VPNs), and other connections emanating from one or two corporate data centers, to a more distributed interconnection architecture that harnesses existing, globally dispersed interconnection/colocation data centers (such as Equinix) that house multiple cloud services, network providers and partner ecosystems.
An interconnected enterprise leverages proximate, direct, high-speed interconnections among clouds, partners and other IT delivery systems for fast, low-latency interactions, bringing all of these services closer to dispersed global users. Rather than building multiple connections one by one, organizations can simply extend their network to the nearest interconnection access point.
With systems of engagement and data pushed to the edge, close to mobile users, the entire user experience on any device is transformed. Countless organizations have benefited from this transformative architecture, including these Equinix customers:
- A major multinational banking and financial services firm reduced latency by 45% by strategically deploying its banking applications in multiple global interconnection centers.
- A health care Software-as-a-Service (SaaS) provider achieved real-time customer service interactions by leveraging a globally dispersed interconnection architecture to securely deliver on-demand software updates.
The future is engagement, which means you need to get up close and interconnect.
In a recent interview, DeNA West CEO Shintaro Asako made some interesting comments regarding the future of the company’s partnership with Nintendo. In short, they expect a lot of people to play these games.
While the interview is pretty up front and Asako’s answers are direct, I will be honest: it does come across as a bit exploitative in tone. Admittedly, this is where DeNA currently stands in its approach to mobile gaming monetisation. However, as this is a partnership, I had hoped that Nintendo’s line of thinking would have figured more prominently in the planning by now.
In any case, Asako made two very interesting comments, the first being that, “Hundreds of millions of people have bought Nintendo consoles. Those are people who decided to spend a minimum of $200 just to get access to Nintendo IP. That number is already twice as big as the Candy Crush total user base. Not only that, every single person buying Nintendo devices spends an average of about $100 per year on software. So I have no question that when Nintendo’s mobile games come out, at least 150 or 200 million people will try it. These people are super core Nintendo fans who are used to spending $150 to $250 just to access the content.”
In short, even if only a small fraction of those users spends a modest amount of money per month, it adds up to big money on a recurring basis. Given the size and loyalty of Nintendo’s fanbase, they are likely to pay up, so long as it is handled properly.
This is the key point, though: going down the full-on exploitative freemium route here would be disastrous and critically damage Nintendo’s brand. However, freemium faces a huge problem within its business model, as it relies on a small group of high-spending users called “whales”.
It’s been 385 years to the month since Gov. John Winthrop founded and named “Boston,” a hilly peninsula that had been settled earlier in 1630 by his fellow Puritans. This week, that historic U.S. city becomes a destination for people looking for insight into the cloud, a technology that’s reshaping the history of the digital world.
As its title suggests, the conference will focus on the cloud’s impact on business, with its opening session specifically looking at how new technologies – including software-defined networking and the Internet of Things – are redefining the cloud marketplace. As we gear up for the Boston conference, our minds are focused on how these technologies will impact what companies do in the cloud.
The current and future impact of cloud on the enterprise is well-documented, including in the recent report by Oxford Economics, “The Cloud Grows Up.” The report indicated that nearly 70% of businesses surveyed plan to make “moderate-to-heavy” cloud investments over the next three years. Other findings with implications for the cloud marketplace:
- 61% of survey respondents expected their companies to have developed new products or services via the cloud within three years, up from 26% that had done so.
- 51% expected to have developed new lines of business via the cloud in three years, compared to 28% that had done so.
- 50% expected to have entered new markets in three years, compared to 40% that had done so.
Given the pace of cloud change in the past three years, the cloud marketplace three years from now could look nothing like what we can conceive today. But the two technologies singled out by conference organizers will play a huge role in getting us from now to whatever’s next.
Software-Defined Networking (SDN): SDN’s huge advantage is that it frees the enterprise to shift network control from physical network devices it has to “touch” and maintain (i.e. switches, routers) onto software applications that allow centralized, end-to-end network provisioning and visibility. SDN is at the heart of the kind of cloud services and infrastructures that are becoming less a feature of this interconnected era and more a necessity to operate in it.
That’s because business is changing to require instant, simultaneous interconnection between cloud partners to reach increasingly mobile end users at anytime, anywhere, on any device. That kind of agility can’t happen without a dynamic, software-based architecture. Equinix already offers this capability with our Cloud Exchange, which at its core, uses SDN within the Equinix Programmable Network.
Internet of Things (IoT): A lot of the excitement about the IoT centers around the new business insights it can offer. Until now, we’ve never had sensors on deep sea ships, for instance, spilling out information about weather changes, ocean currents and cargo conditions, so we’ve never had a chance to see the innovations and efficiencies that information might lead to. IBM midmarket business general manager John Mason calls data “the new natural resource.” But this resource can’t exist without the cloud, and it can’t be mined without the cloud, so the growing significance of the IoT in the cloud marketplace is clear – for businesses of every size.
“I think eventually every business has to have somewhere in its portfolio and go-to-market approach a range of cloud services,” Mason told Forbes.
At Equinix, we agree with the folks in Boston that “cloud means business.” Click the link to learn more about how our cloud infrastructure solutions can help the enterprise do business.
When asked which Value-Added Reseller (VAR) is most likely to win their enterprises’ business for a significant hosting project, respondents most often named IBM (18%), followed by Microsoft (11%), Amazon (8%) and Dell (7%).
Database (57%), e-mail (54%) and business applications (ERP, CRM & industry-specific apps) (49%) are the three leading application hosting investments enterprises will be making in the next two years.
These and other insights are from Beyond Infrastructure: Cloud 2.0 Signifies New Opportunities for Cloud Service Providers (66 pp., no opt-in), a report providing valuable insights into the Managed Service Provider (MSP) and Cloud Service Provider (CSP) landscape. The study was conducted by 451 Research LLC and commissioned by Microsoft. Despite being vendor-sponsored, this study provides several useful take-aways on the broader MSP and CSP marketplace. Please see the first pages of the study for more on the methodology.
Key take-aways include the following:
- Enterprises’ highest expectations when moving to hosted services or cloud computing are improved technology quality on platforms and applications (22%), help growing the business (18%), and improved availability and better business service (13%). The following graphic analyzes the expectations enterprises have when moving to the cloud.
- Within two years, 34% of enterprises will have 60% or more of their applications on a cloud platform. 47% of marketing departments will have 60% or more of their applications on a cloud platform in two years. These and other insights are from the graphic, Future Percent of Applications in Cloud by Department.
- 63% of all enterprises surveyed are running an on-premises private cloud alongside a hosted private cloud, 45% run an on-premises private cloud with a public cloud, and 32% have a hosted private cloud integrated with a public cloud. The following graphic shows the relative level of interoperability maturity across the three hybrid cloud scenarios 451 Research evaluated. MSPs and CSPs need to excel at mastering cloud interoperability and integration technologies to keep pace with the wide variety of enterprise needs in this area now and in the future.
- Security (35%), better control by IT teams (19%) and improved performance (17%) are the three most common factors driving enterprises to move workloads from public to private clouds. It’s interesting to note that only 14% of respondents see workload migrations from public to private clouds being cost driven, and only 7% attribute the shift to IT centralization plans.
- Backup and Recovery (68%), Disaster Recovery/Site Recovery (54%), Application development tools & platforms (47%) and Mobile Services (47%) are the four Managed Services enterprises anticipate investing the most in over the next two years. The following graphic provides an overview of forecasted spending on Managed Services over the next two years.
- Database (57%), e-mail (54%) and business applications (ERP, CRM & industry-specific apps) (49%) are the three leading application hosting investments enterprises will be making in the next two years. Analytics including Business Intelligence (BI), data mining and Big Data (41%) is the sixth-most mentioned area for future investment. The following graphic provides an overview of Application Hosting investment priorities over the next two years.
- When asked which Value-Added Reseller (VAR) is most likely to win their company’s business for a significant hosting project, respondents most often named IBM (18%), with Microsoft (11%), Amazon (8%) and Dell (7%) mentioned second through fourth. When asked which system integrator (SI) would most likely win a significant hosting project, IBM was mentioned most often (25%) followed by Microsoft (13%). Please see the second graphic for the top-mentioned SIs.
- CIO/CTOs are the most influential C-level executives regarding purchasing decisions related to cloud-based Application Development and IT Operations/Administration apps. CEOs are the most influential regarding cloud-based marketing app adoption. The following two graphs illustrate the stakeholder decision making authority roles of C-level executives.
WinWeb’s cloud-based business management solutions have a new focus on industry-specific requirements, with special applications for retail, wholesale, services, manufacturing, healthcare, financial services, non-profit organizations, membership businesses, and franchises.
WinWeb industry solutions offer basic business functions, such as accounting, invoicing and business planning as well as industry-specific functionality, for example multi-location and multi-channel sales for retail.
Read more about WinWeb industry solutions at WinWeb.com.