2020: Retail as a Service?

Digital disruption is typically seen as a form of “waterfall innovation”, where a new entrant unseats legacy players by adopting a radical new approach to service delivery using technology (like Amazon leveraging its own cloud-based e-commerce platform capabilities to beat incumbent Retailers on convenience and price). A challenge to this view is that such disruptors are actually applying a form of “agile innovation”, where by incrementally developing their own live services they gradually transform and re-shape a market. A detailed look at Amazon finds its approach to customer service improvement is not disruptive but iterative: over the last ten to fifteen years it has reinvested its net revenues in R&D (rather than taking short-term profit) to continually drive massive growth.

The implication is that a Retailer can exploit the competitive advantages of digital disruption by using an iterative service delivery approach – so what could be the benefits and challenges of this “Retail As A Service” model? Here are some ideas…

OpEx Funded Innovation – A major blocker to Retailers investing in digital transformation is that it can involve significant upfront capital expenditure to deliver a return on investment that is difficult to forecast and realise. Applying an “as a service” approach, an alternative could be to deliver small, incremental improvements funded by a portion of the Retailer’s margin earned during the same financial year. There are no big financial bets: the Retailer invests only what it earns from the market, with the added benefit that such OpEx-funded innovation can rapidly pivot to changing customer demands. Yet any slicing of margin will impact a Retailer’s profitability – its owners or shareholders would need to tolerate a different form of financial risk to make this approach acceptable: reduced, variable short-term profit for potentially significant long-term gains.

Zero Physical Asset Operating Model – Could the application of a service-based approach to delivery be extended beyond the traditional areas of IT and back-office transformation into other parts of a Retailer’s operating model? For example, a Retailer could run a “zero physical asset” business, where front-end services like stores, supply chain management, even sales staff are provisioned on a pay-as-you-go basis. A key benefit would be that the Retailer doesn’t run the risk of owning long-term fixed assets like property or technology that may become commercially unviable or obsolete. However, this would create new risks – a key one being that the Retailer becomes wholly dependent on other service providers’ availability and ability to innovate to meet its competitive needs.

If you would like more information about how digital transformation can benefit your retail business, leave a reply below or contact me by email.

Bob Dylan was right about Digital Transformation

Bob Dylan is recognised as one of the most influential writers of the 20th century. He is not, though, seen as an inspiration for the digital age. Perhaps he should be? In his 1964 song “It’s Alright, Ma (I’m Only Bleeding)”, he observes that “he not busy being born is busy dying”. He couldn’t have been more prescient.

Organisations need to continually “be busy being born” and innovate or face the alternative.

Think about it: what differentiates companies hobbling along the digital highway from the ones paving the way? The ability to embrace change, refuse the status quo and turn the business into an ever-evolving entity.

Put another way, being digital is about reconciling the pace of adoption of new technologies with the pace of their commoditisation. The latter recently experienced a dramatic acceleration, while the former is often stuck in old-world mode.


Old world versus Digital world

Twenty years ago, adopting new software was a big deal. There were limited numbers of vendors in each market segment. Software customisation or process transformation was necessary to take advantage of the technology. Integration was complex, and ongoing maintenance and support often presented challenges. All of this resulted in expensive acquisition costs, from both a financial and an effort perspective. Long-term supplier contracts were the norm.

Once software was installed, and the vendor proven, it was a lot easier for an organisation to allow the vendor to expand its footprint through additional modules and products rather than go back to the market to look for alternative solutions.

From a vendor perspective, selling and delivering software was costly, requiring a large sales team to reach customers and negotiate complex contracts. Vendor delivery teams needed to be highly skilled, building bespoke integrations to satisfy the specific needs of customers.

New software integration was expensive and risky, and therefore needed careful consideration. The pace of adoption was slow, as software was seen as complex – far from being a commodity.

Today, the pace of commoditisation has increased by an order of magnitude, mainly due to cloud technologies. Let’s have a look: what does innovation mean today in the enterprise world? Big data maybe, machine learning and AI, blockchain or IoT? All of these have already been turned into commodities. Fancy running some machine learning algorithms on your customer database? AWS has an API for that, and a first run shouldn’t take more than a few hours of work. The same goes for most big data technologies, IoT, blockchain and even virtual reality.
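To make that concrete, here is a minimal sketch of the kind of “API for that” call the paragraph alludes to, using Amazon Comprehend’s sentiment endpoint as one example – the service choice, region and input text are illustrative assumptions, not a specific recommendation:

```python
# A minimal sketch: sentiment analysis on customer feedback via boto3.
# Assumes AWS credentials are configured; region and text are illustrative.
import boto3

comprehend = boto3.client("comprehend", region_name="eu-west-1")

response = comprehend.detect_sentiment(
    Text="The delivery was late but the product itself is great.",
    LanguageCode="en",
)
print(response["Sentiment"])        # e.g. "MIXED"
print(response["SentimentScore"])   # per-class confidence scores
```

A first run really is a few lines of code – the effort sits in data preparation, not in provisioning the technology.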

as-a-Service paradigm

The as-a-Service paradigm has drastically reduced the costs, complexity and risks of adopting new software and technologies. The SaaS model, for instance, by turning CAPEX into OPEX, has all but abolished the notion of long-term commitment.

Should your company use this marketing software or that one? Who cares? Use both, allow yourself to pay a bit more for a month or two, then keep the one that best meets your needs. Going further, why even decide at company level? One department may prefer one product because it measures what they want in exactly the way they want, while another department may prefer a different one. With almost no acquisition and no integration costs, why over-rationalise at the expense of business value and user experience?
Standardisation is still worth considering for non-differentiating applications, but it takes a much less prominent position.

The Digital highway

All this said, most old-world companies still look at innovation through the same lens as before, missing business opportunities and losing ground to new entrants.

If conducting an IoT experiment means running an RFP, conducting a six-month PoC and signing a multi-year contract, then you may be doing IoT, but you’re still hobbling along the digital highway.

Velocity is key to transforming your company into an ever-evolving, fast-learning business.

“He not busy being born is busy dying”

Thanks to Clara, Gavin, Jian and Robin for their kind guidance.


What do you think? Leave a reply below or contact me by email.


Come on vendors, get it together: Office 365, Google, Dropbox, etc

For many of our customers, large and small, their first foray into the beautiful world of cloud computing is driven by a less beautiful compelling event related to one of the following:

  • On-premise email servers (typically Microsoft Exchange) require an upgrade of either software, hardware or both
  • Licensing and upgrades of the Microsoft Office suite, typically as part of some enterprise-wide licensing agreement (or maybe an audit!)

So, if you are approaching a refresh, what should you do? There are myriad comparisons out there on the web of the features and costs of Microsoft Office 365 versus Google Apps, so I won’t add to that. To cut to the chase, feature-wise they are approaching parity, though of course it depends on the specific requirements of your organisation. What I wanted to cover was the usual corporate dilemma, and why Microsoft is currently (and probably for a long time) the right answer.

The logic goes like this. I really quite like the idea of using Google Apps: it’s a bit easier to administer (in my view – though partly because it’s less rich and doesn’t expose all the cruft of Exchange in the config pages), and it just feels a bit more hip and happening. Although, to be fair, Microsoft have managed to shed their corporate image and loosen up a bit, as demonstrated by an identification challenge that tickled me when registering for Office 365…

But I really only want one vendor and configuration to manage – surely I can get everything I need from one vendor in 2016? It really depends what line of business you are in, but certainly for us, as a professional services-based business, some customers will expect materials to be sent to them in Microsoft Office formats (Word, Excel, PowerPoint). Whilst other vendors can work with these formats, or I could use LibreOffice, I know that the interoperability just isn’t quite good enough. And my finance team are going to rebel if I don’t give them Excel, so… I really need to buy Microsoft Office – and this is where the costing dimension comes in. Pricing starts at £7.80/month/user for up to 300 users for Office 365 with the ability to download all the Office client applications (jumping to a pre-discount £14.70 for the Enterprise E3) – and that immediately makes a Google Apps-based solution unattractive, as you basically have to pay for many of the services twice, e.g. email and Skype or Hangouts, from both Microsoft and Google. A masterstroke of lock-in from Microsoft.

OK, so I accept this as a fact of life, and resign myself to going “all in” with Office 365.  Not so fast.  Like many organisations today, I might have a BYOD or CYOD policy and I know that my users have both PCs and Macs. That sounds fine – Office 365 supports Macs.  Yes, the Mac implementations of Word, Excel etc are a bit different (mainly as a result of the weird double menu bar thing – why have one “Insert” menu when you can have two?) but the apps are pretty good these days.

But the issue comes with file sharing and synchronisation on the Mac. Whilst there is a Mac client for syncing your OneDrive so you can work offline, it does not sync shared files – so the only way you can access them is via the web interface, not something that you are going to enjoy on the train on the way home. This fact is a little buried in Microsoft’s documentation, where there’s more evidence of their sense of humour…

So that leaves us with an issue: how to support collaborative file sharing across our organisation. This is what Dropbox (in my experience) does best – it just works across clients. So you end up needing at least one other vendor’s product to plug this functionality gap, which is frustrating. I was talking to a start-up the other day – they are not big, but they have active subscriptions to all three: Google Apps (as the email search is the best), Office 365 (as they need the client apps) and Dropbox (for the file sharing). I bet this is much more common than it should be.

But, it’s on Microsoft’s Office 365 roadmap so maybe I’ll be able to have just one subscription in 2017.

If you want to read more on comparisons of Google Apps and Office 365 etc – this is a great resource.


Beamap is the cloud consultancy subsidiary of Sopra Steria

Digital security: battening down the hatches in a sea of data

by Torsten Saemann, Sopra Steria GmbH

Digitalisation without the use of modern technologies? Inconceivable! With cloud computing, the Internet of Things is rapidly becoming part of our everyday life. It seems like magic that we can call up practically everything known to man with tools that fit in our pocket. With a few clicks we can summon items to our front door that were produced at the other end of the world. So far so good. However, nobody seems interested in the fact that the technological structures of the digital world are shaky and insecure.

This is precisely how Frank Rieger of the Chaos Computer Club (CCC) sees things. On Spiegel Online he explains the fragility of the foundations of Industry 4.0 by means of the following comparison:

The pillars of the world in the dawning digital age are crumbling. The technologies on which the networking of everyday life and the flows of information that drive the economy are based are more like temporary wooden frames than solid steel constructions. Generally everything functions – provided no one jolts the boards or saws through a beam.

Avoid flying blind during digitalisation

These digital wooden frames give rise to all sorts of security loopholes, the product of poorly written software. Programmers make errors – this much we know. However, it is frequently the case that IT management in German companies is, consciously or subconsciously, heading towards unknown risks. Our study on the topic of digital security proves this: one third of all IT decision-makers in Germany are implementing technologies even when the IT risks are completely unknown.

Dr. Gerald Spiegel, Head of Information Security Solutions at Sopra Steria Consulting finds this insight shocking: “The fact that such a large number of IT decision-makers are, as it were, flying blind in their approach to digitalisation is worrying. The behaviour within the manufacturing sector is particularly rash – and this in spite of the fact that industrial plants increasingly fall victim to cyber attacks.” The prospects facing a digitalised economy are far from good if German companies are exposed to the danger of cyber attacks, in some cases with no protection whatsoever.

Digital negligence in German companies

The lack of initiative in many companies when it comes to protection against cyber attacks is disastrous. According to our study, this is the opinion of 85 percent of IT decision-makers. The fact that it is board members and managing directors in particular who play down the risk of cyber attacks is, given the liability risk, incomprehensible. These companies are fully aware of their digital weak points, and it is likely that their dependency on digital systems will continue to grow exponentially. Maintaining a high rate of innovation while simultaneously cutting IT costs just doesn’t work.

Adjusting investment in digital security to suit the rate of innovation

But how can you convert wooden structures into steel? When driving forward the digitalisation and automation of processes, companies should err on the side of caution. This includes pushing the introduction and implementation of a company-wide IT security strategy. This strategy must lay out the most important information security objectives and the principles for their implementation.

The IT security strategy should also address trends and new technologies. And this must take place on a continuous basis. The IT department must ensure that a security concept is submitted to the specialist department prior to an application or IT system “going live”. Furthermore, security-relevant programming errors can be avoided through the use of secure programming languages. Penetration tests for applications and IT systems – following a release change for example – are another important security component.

Digital excellence built on digital security

The digitalisation of the economy brings with it new and far-reaching challenges for digital security within a company. Cyber attacks on IT infrastructures are becoming increasingly complex and professionally executed. And they happen on a daily basis. Defensive measures are costly and take time, but they are beneficial and necessary. Promoting a slower but more digitally secure approach within IT departments and in front of board members certainly isn’t cool, but in the long term it is definitely the better strategy.

What are your thoughts? Leave a reply below or contact me by email.

Discover more about our experience in delivering secure services to protect information, applications, infrastructures and people.

Bridging the gap between Google Cloud Platform and enterprises

Recently, I spent some time with Google to understand their Google Cloud Platform (GCP) in more detail. Everyone knows that the leaders (in adoption terms at least) in this space are AWS and MS Azure so I thought it would be interesting to hear about GCP and Google’s own cloud journey.

GCP was started in 2008 and, as with AWS, Google’s objective was to bring best practices used internally to the external market. According to Google, most of their internal tools are very engineering-focused, so their challenge was to ensure that GCP was fast and easily consumable for an external market.

Here are my key observations:

GCP is focusing on enterprises entering their second-wave cloud journey

The IaaS space is a competitive market and Google acknowledges this. Google’s messaging is that cloud is all about what you can do with the platform, and a key objective for GCP is to process large volumes of data quickly (like the Google search engine). They don’t really like the term ‘big data’ as they see all things as data. In their view, it’s the speed at which you can process data that’s their real USP, leveraging GCP services like BigQuery and Cloud Bigtable. Google’s view is that innovation comes from what you can do with the data. For enterprises sitting on large volumes of data, GCP gives them the ability to improve internal processes and a new opportunity to develop and sell new services.
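As a flavour of what “processing data quickly” looks like in practice, here is a minimal sketch of running a query against BigQuery from Python – the project, dataset and table names are illustrative assumptions:

```python
# A minimal BigQuery sketch; assumes the google-cloud-bigquery library
# and an authenticated GCP project. All table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment

query = """
    SELECT customer_id, COUNT(*) AS orders
    FROM `my_project.sales.orders`   -- hypothetical table
    GROUP BY customer_id
    ORDER BY orders DESC
    LIMIT 10
"""

for row in client.query(query).result():  # runs the job and waits for completion
    print(row.customer_id, row.orders)
```

There is no cluster to size or provision here – the scan is parallelised by the service, which is precisely the speed argument Google is making.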

Containers are the way forward for modern application development

Google has been using containers for many years. Everything in Google runs in containers (managed via Kubernetes), and they see this as the future for improving application development efficiency in enterprises. However, they understand the huge gap between the sophistication of what they do internally and what most enterprises do today.
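For readers who haven’t touched Kubernetes, here is a minimal sketch of inspecting a cluster programmatically, assuming the official Python client and a local kubeconfig – purely illustrative:

```python
# A minimal sketch using the official `kubernetes` Python client;
# assumes a reachable cluster and a local ~/.kube/config.
from kubernetes import client, config

config.load_kube_config()      # read credentials from ~/.kube/config
v1 = client.CoreV1Api()

# List every pod the cluster is currently running, across all namespaces.
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```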

When developers at Google write code, they do not think about servers – it is more of a “serverless” computing environment. Scaling is no longer an issue, so their focus is on functionality and innovation. This is where enterprises want to be for infrastructure and new services, but it’s going to be a long journey.

Do enterprises want to be like Google?

In short, yes – in terms of speed and innovation. However, most mature enterprises have decades of legacy applications, infrastructure and strict governance, which makes it difficult to be agile. Google understands that enterprises cannot operate in its unique manner, either technically or culturally. GCP isn’t about turning your enterprise into Google – it is simply about enabling enterprises to leverage services in a more efficient way.

An example given during the presentation was that many years ago there were multiple search engines (Yahoo, AltaVista, Excite etc). Google’s USP, however, was to process data accurately and quickly through a simple UI. This was disruptive in the marketplace, as they changed the way data was queried, processed – and how it was managed. The lesson for enterprises is that new digital initiatives will always require new ways of thinking (forget the 20 years of legacy infrastructure and process), and using cloud platforms to develop new services could be game-changing in their markets.

GCP is still battling other cloud providers for the enterprise market

Cloud is all about innovation and this is GCP’s play!

With enterprises going “all in” with AWS or Azure, another cloud platform may just make things more complex; however, I can see the value of GCP in its speed, machine learning and data processing capabilities. Google may find it challenging to persuade enterprises to use GCP if their workforce is already trained in Azure or AWS. Enterprises like to stay with a platform simply because their workforce has the skills – inertia is a powerful force.

Unlike Microsoft, Google do not currently have the enterprise relationships. However, neither did AWS, and they have made great progress in that part of the market. Google’s partner channel therefore needs to broaden out to help drive adoption. Google are also hiring people with more of an enterprise background so they can better understand the psyche of these customers.

Questions around Google’s ability to support large-scale enterprise customers will remain, however; some years back the same questions were asked about AWS – and now look at their portfolio of enterprise customers. GCP may not currently have the market share of AWS or Azure, but it definitely has a platform rich in interesting features, which will help Google narrow the gap in the enterprise market. An open question is whether its focus on relatively niche innovation features will present a broad enough portfolio of services for GCP to be seen as a credible “all in” choice, or just a niche big data service provider.

What do you think? Leave a reply below or contact me by email.

Teachable Brand AI – a new form of personalised retail customer experience?

Within the next five years, scalable artificial intelligence in the cloud – Brand AI – could potentially transform how retailers use personalisation to make every store visit a memorable, exclusive customer experience distinct from anything a competing digital disruptor could offer.

Arguably the success of this engagement approach is contingent upon a retailer’s ability to combine a range of data sources (such as social media behaviour, loyalty card history and product feedback) with its analytics capabilities to dynamically create personalised in-store moments of delight for an individual customer – moments that drive their decision to purchase.

But could the truly disruptive approach be one where a customer is continually teaching the Brand AI directly about their wants or needs as part of their long-term personal relationship with a retailer?

Could this deliver new forms of customer intimacy online competitors can’t imitate? Here are some ideas…

  • Pre-visit: Using an existing instant messaging app the customer likes (such as WhatsApp or Skype), he or she tells the Brand AI about their communication preferences (time, date, etc) and what content about a specific retailer’s products or services (such as promotions or new releases) they are interested in. This ongoing relationship can be changed at any time by the customer and can be proactive or reactive – the customer may specify that the Brand AI only engages them when they are within a mile of a retailer’s store, or one week before a family member’s birthday, for example (a sketch of such a preference record follows this list). Teachable Brand AI empowers the customer to be in complete control of their own personalised journey with a retailer’s brand.
  • In-store: The Brand AI can communicate directly with in-store sales staff about a customer’s wants or needs that specific day to maximise the value of this human interaction, provide on-the-spot guidance and feedback about the physical products the customer is browsing to drive a purchasing decision, or dynamically tailor in-store digital experiences such as virtual reality or media walls to create genuine moments of customer delight. Teachable Brand AI has learned directly from the customer what excites them, and uses this deep insight to deliver a highly differentiated in-store experience online competitors can’t imitate.
  • Post-purchase: The customer can ask the Brand AI to automatically register any warranties, guarantees or other after-sales support or offers for their purchased goods. In addition, the customer can ask the Brand AI to arrange a return if they are unsatisfied or the goods are faulty – to help ensure revenue retention, a replacement or alternative is immediately suggested that can be exchanged at the customer’s own home or another convenient location. The customer can also share feedback about their purchase at any time – Teachable Brand AI drives customer retention while gathering further data and insights to enable greater personalisation of the pre-visit and in-store experience.
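To make the “teachable” idea a little more concrete, here is a purely hypothetical sketch of the kind of customer-taught preference record such a Brand AI might maintain – every class, field and trigger name is illustrative, not a reference to any real product:

```python
# A hypothetical sketch of a customer-taught preference record;
# all names and values here are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EngagementRule:
    trigger: str       # e.g. "within_1_mile_of_store", "7_days_before_birthday"
    channel: str       # e.g. "whatsapp", "skype"
    topics: List[str]  # e.g. ["promotions", "new_releases"]

@dataclass
class CustomerPreferences:
    customer_id: str
    rules: List[EngagementRule] = field(default_factory=list)

    def teach(self, rule: EngagementRule) -> None:
        """The customer adds or revises an engagement rule at any time."""
        self.rules.append(rule)

# Example: the customer teaches the Brand AI to message them on WhatsApp
# about promotions, but only when they are near a store.
prefs = CustomerPreferences(customer_id="c-123")
prefs.teach(EngagementRule("within_1_mile_of_store", "whatsapp", ["promotions"]))
```

The point of the sketch is that the rules are authored by the customer, not inferred about them – which is what distinguishes a teachable Brand AI from conventional analytics-driven personalisation.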

If you would like more information about how big data and analytics can benefit your organisation please contact the Sopra Steria Digital Practice.


What do recent AWS announcements tell us about the cloud market?

As always, Amazon Web Services (AWS) made a bunch of announcements at their recent Chicago Summit.  The new features have been reported to death elsewhere so I won’t repeat that, but there were a few observations that struck me about them…

Firstly, the two new EBS storage volume types, aimed at high throughput rather than IOPS, are priced at 50% and 25% of the standard SSD EBS price, so are effectively a price cut for big data users. As I’ve commented before, the age of big, headline-grabbing “across the board” cloud price reductions is largely over – price reductions now tend to come in the form of better price/performance characteristics. In fact, this seems to be one of Google’s main competitive attacks on AWS.
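For context, provisioning one of these throughput-optimised volumes is a one-parameter choice in the EBS API. A minimal boto3 sketch follows – the region, availability zone and size are illustrative assumptions:

```python
# A minimal sketch of creating a throughput-optimised EBS volume via boto3.
# Region, AZ and size are illustrative; st1 suits large sequential workloads.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

volume = ec2.create_volume(
    AvailabilityZone="eu-west-1a",
    Size=500,             # GiB
    VolumeType="st1",     # Throughput Optimized HDD (sc1 is the colder, cheaper tier)
)
print(volume["VolumeId"])
```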

Of course, I welcome the extra flexibility – it’s always comforting to have more tools in the toolbox.  And to be fair, there is a nice table in the AWS blog post that gives good guidance on when to use each option.  Other cloud vendors are introducing design complexity for well-meaning reasons also, e.g. see Google’s custom machine types.

What strikes me about this is that the job of architecting a public cloud solution is getting more and more complex and requires deeper knowledge and skills, i.e. the opposite of the promise of PaaS. You need a deeper and deeper understanding of the IOPS and throughput needs of your workload, and its memory and CPU requirements. In a magic PaaS world you’d just leave all this infrastructure design nonsense to the “platform” to make an optimised decision on. Maybe a logical extension of AWS’s direction of travel here is to offer an auto-tiered EBS storage model, where the throughput and IOPS characteristics of the EBS volume type are dynamically modified based upon workload behaviour patterns (similar to what on-premise storage systems have been doing for a long time). Auto-tiered CPU/memory allocation would also be possible (with the right governance). This would take away some more of the undifferentiated heavy lifting that AWS try to remove for their customers.
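To illustrate the auto-tiering idea (and to be clear, this is pure speculation on my part, not an AWS feature), the decision rule might look something like the sketch below – every threshold is invented for illustration:

```python
# A hypothetical auto-tiering decision rule: pick an EBS volume type from
# observed workload behaviour. Thresholds are illustrative, not AWS guidance.
def choose_volume_type(avg_iops: float, avg_throughput_mbs: float) -> str:
    if avg_iops > 10_000:
        return "io1"   # provisioned IOPS for IOPS-hungry workloads
    if avg_throughput_mbs > 160:
        return "st1"   # throughput-optimised HDD for big sequential workloads
    if avg_iops < 100 and avg_throughput_mbs < 10:
        return "sc1"   # cold HDD for rarely-touched data
    return "gp2"       # general-purpose SSD as the default

print(choose_volume_type(avg_iops=50, avg_throughput_mbs=200))  # -> "st1"
```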

So… related to that point about PaaS, another recent announcement was that Elastic Beanstalk now supports automatic weekly updates for minor patches/updates to the stack that it auto-deploys for you, e.g. patches to the web server. It then runs confidence tests that you define before swapping traffic over from the old to the new deployment. This is probably good enough for most new apps, and moves the patching burden away from the operations team to AWS. This is potentially very significant, I think – it sits in that fuzzy area where IaaS stops and PaaS starts. I must confess to having not used Elastic Beanstalk much in the past, sticking to the mantra that I “need more control” and so going straight to CloudFormation. I see customers doing the same thing. As more and more apps are designed with cloud deployment in mind and use cloud-friendly software stacks, I can’t see any good reason why this dull but important patching work cannot be delegated to the cloud service provider, for a significant operations cost saving. Going forward, where SaaS is not an appropriate option, this should be a key design and procurement criterion in enterprise software deployments.
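For the curious, switching on these managed platform updates is a configuration change on the environment. A minimal boto3 sketch, where the environment name and maintenance window are illustrative assumptions:

```python
# A minimal sketch of enabling Elastic Beanstalk managed platform updates;
# the environment name and maintenance window are illustrative.
import boto3

eb = boto3.client("elasticbeanstalk", region_name="eu-west-1")

eb.update_environment(
    EnvironmentName="my-app-prod",
    OptionSettings=[
        {"Namespace": "aws:elasticbeanstalk:managedactions",
         "OptionName": "ManagedActionsEnabled", "Value": "true"},
        {"Namespace": "aws:elasticbeanstalk:managedactions",
         "OptionName": "PreferredStartTime", "Value": "Sun:02:00"},
        {"Namespace": "aws:elasticbeanstalk:managedactions:platformupdate",
         "OptionName": "UpdateLevel", "Value": "patch"},  # minor patches only
    ],
)
```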

Finally, the last announcement that caught my eye was the AWS Application Discovery Service – another small nail in the coffin of SI business models based on making some of their money from large-scale application estate assessments. It’s not live yet, I’m not clear on the pricing (maybe it will only be available via AWS and their partners), and it probably won’t be mature enough to use when first released. It will also have some barriers to adoption, not least that it requires an on-premise install and so will need to be approved by a customer’s operations and security teams – but it’s a sign of the times and the direction of travel. Obviously AWS want customers to go “all in”, migrate everything including the kitchen sink and then shut down the data centre, but the reality from our work with large global enterprise customers is that the business case for application migrations rarely stacks up unless there is some other compelling event (such as a data centre contract expiring). However, along with the database migration service and the like, they are steadily removing the hurdles to migration, making those marginal business cases just that little bit more appealing…

What are your thoughts? Leave a reply below, or contact me by email.