Leroy Merlin: Transforming the Russian home improvement market with APIs

Editor's note: Today we hear from Sergei Lega, enterprise architect at Leroy Merlin Russia, a retail chain specializing in the sale of products for construction, decoration, and home furnishing. Read on to learn how Leroy Merlin is using APIs and API management to simplify how partners integrate with its services.

Leroy Merlin is rapidly expanding our network of retail stores in Russia, and as part of this expansion we are undertaking a digital transformation. Not only has this process tested our technological capabilities, but it has also challenged us to transform our mindset. To offer expanded services to our customers, we rely on a rich set of APIs and microservices created and managed in Google Cloud's Apigee API Platform.

Leroy Merlin Russia sells products for construction, home decoration, and furnishing. As a DIY-focused retailer, we see a great opportunity to differentiate ourselves in the marketplace by expanding the types of services we offer our customers beyond the sale of our products. We currently have more than 70 partners around Russia focused on three use cases: window installation, kitchen installation, and professional building materials. These partners offer customers and building professionals access to services that enhance their Leroy Merlin customer journey.

But we wanted to make it even simpler and more seamless for customers to access these services. That required a clearly defined API strategy. We now offer a set of endpoints, built from microservices and exposed as APIs, that allow us to securely share pricing, inventory, and product information, along with payment services.
These services let us connect our platform and services with all the third-party merchants in our ecosystem; they can easily get onboarded, then upload and synchronize their product databases to the Leroy Merlin Marketplace quickly and in a scalable environment.

Now, when a customer purchases windows online or from one of our stores, they can continue their journey by acquiring the necessary measurement and installation services at the same time, even though these services might be provided by one of our partners. The same goes for kitchen installation, which typically requires a complex set of services, such as plumbing and electrical work, that the customer would normally need to source independently.

When Apigee announced its Istio integration in 2018, we knew that we could simplify and manage our exposure of microservices from an Istio mesh by adding API management capabilities via Istio's native configuration mechanism. At the moment, we're using Istio in a few Kubernetes instances, which makes it much simpler to share and consume these services inside our development team.

Apigee's API management policies and reporting can be applied to any service, so policies such as API key validation, quota enforcement, and JSON Web Token validation can be easily controlled from the Apigee UI. In the future, we plan to extend Istio company-wide as a cornerstone of our microservices management, which will give us very granular control of traffic flows and access policies. It will also give us 360-degree monitoring and security capabilities, along with service discovery in a multi-cluster environment.

Many of our roughly 100 APIs are exposed to third-party developers, but some are exposed internally as well; we are working to make Apigee the focal point for integrations and new service development inside the company.
As we continue to develop microservices and attract new developers to our marketplace, we are keeping a mindset of APIs as products, which reflects our customer journey-focused strategy. By the end of 2019, we expect to finish our developer platform and make it fully usable; at that point we will really begin to scale our ecosystem and see concrete benefits for Leroy Merlin Russia, our customers, and our partners.

Our API journey is all about maximizing connectivity and agility with an API-first architecture, built in close collaboration with our partners nationwide. So far, Apigee has been a great partner on this journey.

Learn more about API management on Google Cloud by visiting our Apigee page.
Source: Google Cloud Platform

Building hybrid blockchain/cloud applications with Ethereum and Google Cloud

Adoption of blockchain protocols and technologies can be accelerated by integration with modern internet resources and public cloud services. In this blog post, we describe a few applications of making internet-hosted data available inside an immutable public blockchain: specifically, making BigQuery data available on-chain using a Chainlink oracle smart contract. Possible applications are innumerable, but we've focused this post on a few that we think are of high and immediate utility: prediction marketplaces, futures contracts, and transaction privacy.

Hybrid cloud-blockchain applications

Blockchains focus mathematical effort on creating a shared consensus. Ideas quickly sprang up to extend this model to party-to-party agreements, i.e. contracts. This concept of smart contracts was first described in a 1997 article by computer scientist Nick Szabo. An early example of inscribing agreements into blocks was popularized by efforts such as Colored Coins on the Bitcoin blockchain.

Smart contracts are embedded into the source of truth of the blockchain, and are therefore effectively immutable once they're a few blocks deep. This gives participants a mechanism to commit crypto-economic resources to an agreement with a counterparty, and to trust that contract terms will be enforced automatically, without requiring third-party execution or arbitration, if desired.

But none of this addresses a fundamental issue: where to get the variables with which the contract is evaluated. If the data are not derived from recently added on-chain data, a trusted source of external data is required. Such a source is called an oracle.

In previous work, we made public blockchain data freely available in BigQuery through the Google Cloud Public Datasets Program for eight different cryptocurrencies. In this article, we'll refer to that work as Google's crypto public datasets. You can find more details and samples of these datasets in the GCP Marketplace.
This dataset resource has led a number of GCP customers to develop business processes based on automated analysis of the indexed blockchain data, such as SaaS profit sharing, mitigating service abuse by characterizing network participants, and using static analysis techniques to detect software vulnerabilities and malware. However, these applications share a common attribute: they all use the crypto public datasets as an input to an off-chain business process.

In contrast, a business process implemented as a smart contract is performed on-chain, and that is of limited utility without access to off-chain inputs. To close the loop and allow bidirectional interoperation, we need to make not only blockchain data programmatically available to cloud services, but also cloud services programmatically available on-chain to smart contracts.

Below, we'll demonstrate how a specific smart contract platform (Ethereum) can interoperate with our enterprise cloud data warehouse (BigQuery) via oracle middleware (Chainlink). This assembly of components allows a smart contract to take action based on data retrieved from an on-chain query to the internet-hosted data warehouse. Our examples generalize to a pattern of hybrid cloud-blockchain applications in which smart contracts efficiently delegate complex operations to cloud resources. We will explore other examples of this pattern in future blog posts.

How we built it

At a high level, Ethereum Dapps (i.e. smart contract applications) request data from Chainlink, which in turn retrieves it from a web service built with Google App Engine and BigQuery.

To retrieve data from BigQuery, a Dapp invokes the Chainlink oracle contract and includes payment for the parameterized request to be serviced (e.g. the gas price at a specified point in time). One or more Chainlink nodes are listening for these calls, and upon observing one, a node executes the requested job.
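The round trip just described (Dapp to oracle contract, to a listening Chainlink node, to a web service, and back via a callback) can be sketched as a toy Python simulation. Everything here is hypothetical and greatly simplified: real Chainlink requests are on-chain events paid in LINK, and none of the class or function names below are the actual Chainlink API.

```python
# Toy model of the request lifecycle described above.
# All names are illustrative, not the real Chainlink API.

class OracleContract:
    """Stand-in for the on-chain Chainlink oracle contract."""
    def __init__(self):
        self.listeners = []   # Chainlink nodes watching for requests
        self.pending = {}     # request_id -> Dapp callback

    def request_data(self, request_id, params, payment, callback):
        # The Dapp attaches payment and a callback for the result.
        self.pending[request_id] = callback
        for node in self.listeners:   # emit a request "event"
            node.observe(self, request_id, params)

    def fulfill(self, request_id, result):
        # Called by a node; forwards the result to the Dapp's callback.
        self.pending.pop(request_id)(result)

class ChainlinkNode:
    """Off-chain node that services requests via a web service."""
    def __init__(self, web_service):
        self.web_service = web_service

    def observe(self, oracle, request_id, params):
        result = self.web_service(params)   # e.g. App Engine + BigQuery
        oracle.fulfill(request_id, result)

# A canned-query web service: only parameterized lookups are allowed.
GAS_PRICE_BY_BLOCK = {7500000: 2.1e9, 7500001: 2.4e9}  # made-up values (wei)
def web_service(params):
    return GAS_PRICE_BY_BLOCK[params["block_number"]]

oracle = OracleContract()
oracle.listeners.append(ChainlinkNode(web_service))

results = {}
oracle.request_data(
    request_id="req-1",
    params={"block_number": 7500000},
    payment=1.0,  # LINK; ignored in this toy model
    callback=lambda value: results.update(avg_gas_price=value),
)
```

After the call, `results` holds the oracle-reported gas price and no requests remain pending; in the real system, each of these hops is a separate transaction or off-chain job rather than a synchronous function call.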
External adapters are service-oriented modules that extend the capability of the Chainlink node to authenticated APIs, payment gateways, and external blockchains. In this case, the Chainlink node interacts with a purpose-built App Engine web service.

On GCP, we implemented a web service using the App Engine Standard Environment. We chose App Engine for its low cost, high scalability, and serverless deployment model. App Engine retrieves data from BigQuery, which hosts the public cryptocurrency datasets. The data we've made available come from canned queries, i.e. we don't allow arbitrary data to be requested from BigQuery, only the results of parameterized queries. Specifically, an application can request the average gas price for either (a) a particular Ethereum block number, or (b) a particular calendar date.

After a successful response from the web service, the Chainlink node invokes the Chainlink oracle contract with the returned data, which in turn invokes the Dapp contract and thus triggers execution of downstream Dapp-specific business logic. This is depicted in the figure below.

For details on integrating your Dapp, please see our documentation for requesting data from BigQuery via Chainlink. Illustrative queries to BigQuery can be seen for gas price by date and by block number.

How to use the BigQuery Chainlink oracle

In this section we'll describe how useful applications can be built using Google Cloud and Chainlink.

Use case 1: Prediction marketplaces

Participants in prediction marketplaces allocate capital to speculate on future events. One area of intense interest is which smart contract platform will predominate, because, being network ecosystems, their value is expected to follow a power-law (i.e. winner-take-all) distribution.
There are many differing opinions about which platform will succeed, as well as about how success should be quantified.

By using the crypto public datasets, it's possible for even complex predictions, like the recent $500,000 bet about Ethereum's future state, to be settled successfully on-chain. We've also documented how the variety, volume, recency, and frequency of Dapp utilization can be measured by retrieving 1-, 7-, and 30-day activity for a specific Dapp. These metrics are known as daily-, weekly-, and monthly-active users, and are frequently used by web and mobile app analytics professionals to assess website and app success.

Use case 2: Hedging against blockchain platform risk

The decentralized finance movement is rapidly gaining adoption thanks to its successful reinvention of the existing financial system in blockchain environments that, on a technical basis, are more trustworthy and transparent than current systems.

Financial contracts like futures and options were originally developed to let enterprises reduce, or hedge, risk related to resources critical to their operation. Similarly, data about on-chain activity, such as average gas prices, can be used to create simple financial instruments that pay out to their holders when gas prices rise too high. Other qualities of a blockchain network, e.g. block times and miner centralization, create risks that Dapp developers want to protect themselves against. By bringing high-quality data from the crypto public datasets to financial smart contracts, Dapp developers' risk exposure can be reduced. The net result is more innovation and accelerated blockchain adoption.

We've documented how an Ethereum smart contract can interact with the BigQuery oracle to retrieve gas price data at a particular point in time.
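As a rough illustration of how such a hedge could settle, here is a minimal Python sketch of a collateralized gas-price option: the holder is paid in proportion to how far the oracle-reported gas price exceeds a strike, capped by the collateral the writer locked up. The function, parameters, and numbers are our own illustrative assumptions, not the actual contract stub.

```python
# Minimal sketch of a gas-price hedge: the contract pays its holder when
# the oracle-reported average gas price exceeds a strike price.
# All names and values are illustrative assumptions.

GWEI = 10**9

def settle_gas_option(reported_gas_price_wei, strike_wei, notional_per_wei,
                      collateral_wei):
    """Return (payout_to_holder, refund_to_writer), both in wei.

    The payout grows linearly with how far the reported price sits above
    the strike, capped by the collateral locked in the contract.
    """
    excess = max(0, reported_gas_price_wei - strike_wei)
    payout = min(excess * notional_per_wei, collateral_wei)
    return payout, collateral_wei - payout

# Gas at 30 gwei against a 20 gwei strike: the holder is compensated.
payout, refund = settle_gas_option(
    reported_gas_price_wei=30 * GWEI,   # value an oracle might report
    strike_wei=20 * GWEI,
    notional_per_wei=2,                 # wei of payout per wei of excess
    collateral_wei=50 * GWEI,
)
```

If the reported price stays at or below the strike, the payout is zero and the writer's collateral is fully refunded; if it spikes far above, the payout is bounded by the collateral, so the contract can always settle.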
We've also implemented a stub of a smart contract option showing how the oracle can be used to implement a collateralized contract on future gas prices, a critical input for a Dapp to function.

Use case 3: Enabling commit/reveals across Ethereum using submarine sends

One commonly mentioned limitation of Ethereum itself is a lack of transaction privacy, which lets adversaries exploit on-chain data leakage at the expense of users of commonly used smart contracts. This can take the form of front-running transactions involving decentralized exchange (DEx) addresses. As described in To Sink Frontrunners, Send in the Submarines, the problem of front-running plagues all current DExs and slows the decentralized finance movement's progress, as exchanges are a key component of many DeFi products and applications.

By using the submarine sends approach, smart contract users can increase the privacy of their transactions, successfully avoiding adversaries that want to front-run them and making DExs more immediately useful. Though this approach is uniquely useful for stopping malicious behavior like front-running, it has its own limitations if done without an oracle.

Implementing submarine sends without an oracle produces blockchain bloat. Specifically, the Ethereum virtual machine allows a contract to see at most 256 blocks back in the chain, or approximately one hour of history. This limited scope reduces the practical usefulness of submarine sends because it forces unnecessary denormalization whenever data must be rebroadcast.
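The cost of staying inside that 256-block window can be made concrete with a back-of-the-envelope model; the block-time constant and the helper function below are our own illustrative assumptions, not measurements from the post.

```python
# Rough model of the storage cost described above. Without an oracle,
# data a contract will need later must be rebroadcast (re-stored
# on-chain) at least once every 256 blocks to stay within the EVM's
# lookback window, so the number of stored copies grows with chain
# length. Constants are illustrative.

EVM_LOOKBACK_BLOCKS = 256      # max blocks a contract can look back
AVG_BLOCK_TIME_S = 14          # rough Ethereum block time, in seconds

def rebroadcasts_needed(blocks_until_use):
    """Copies of the data that must be stored on-chain so it is still
    visible when the contract finally consumes it."""
    # One initial store, plus one refresh per full lookback window elapsed.
    return 1 + blocks_until_use // EVM_LOOKBACK_BLOCKS

# The window covers only about an hour of chain history...
window_seconds = EVM_LOOKBACK_BLOCKS * AVG_BLOCK_TIME_S   # ~3584 s

# ...so keeping a value visible for a week's worth of blocks is costly:
blocks_per_week = 7 * 24 * 3600 // AVG_BLOCK_TIME_S
copies = rebroadcasts_needed(blocks_per_week)
```

Under these assumptions, a value needed a week later must be re-stored on the order of 170 times, which is the O(n) growth in on-chain storage that the no-oracle approach incurs.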
In contrast, by implementing submarine sends with an oracle, bloat is eliminated because the operating scope grows to include all historical chain data.

Conclusion

We've demonstrated how to use Chainlink services to provide data from the BigQuery crypto public datasets on-chain. This technique can be used to reduce inefficiencies (the submarine sends use case) and, in some cases, to add entirely new capabilities (the hedging use case) to Ethereum smart contracts, enabling new on-chain business models to emerge (the prediction markets use case).

The essence of our approach is to trade a small amount of latency and transaction overhead for a potentially large amount of economic utility. As a concrete example, ordinary submarine sends require on-chain storage that scales O(n) with blocks added to the blockchain; this can be reduced to O(1) if the calling contract waits an extra two blocks to call the BigQuery oracle. By allowing a contract to reference on-chain data that would otherwise be out of scope, the oracle improves the operational efficiency of the smart contract platform, at the trade-off cost of the additional transactional latency of interacting with an oracle contract.

We anticipate that this interoperability technique will lead developers to create hybrid applications that take the best of what smart contract platforms and cloud platforms have to offer. We're particularly interested in bringing Google Cloud Platform's ML services (e.g. AutoML and Inference APIs) on-chain in future work.
Source: Google Cloud Platform

Sharing enthusiasm from the cloud community for our Looker acquisition

In the week since we announced our intent to acquire Looker, a unified platform for business intelligence, data applications, and embedded analytics, we've heard from many customers, partners, and industry analysts who are enthusiastic about our decision to provide customers with a comprehensive analytics solution. By combining Looker's robust business intelligence and analytics platform with BigQuery, our enterprise data warehouse, customers can solve more business challenges, faster, all while remaining in complete control of their data. Here are a few of the many responses we've heard about the addition of the Looker analytics platform to our portfolio:

"As we serve the evolving needs of our customers, it's critical for us to empower our teams with information," said Barbara Sanders, VP and Chief Architect at The Home Depot. "BigQuery and Looker quickly provide our engineering teams with operational data and visualizations to help identify application or infrastructure issues that could impact the customer experience."

"Our data platform provides a fully managed service that makes it easy and cost-effective to create, manage and scale advanced analytics capabilities for advertisers and marketers," said Iain Niven-Bowling, EVP, 2Sixty and Essence/WPP. "The combination of BigQuery and Looker provides the underlying technology, and Google's acquisition only strengthens this and enables us to continue to build highly valued products on top of this robust end-to-end solution."

"The data analytics market is rapidly evolving and changing, and 451 Research has identified that businesses who successfully turn data into insight via analytics have a competitive advantage," said Matt Aslett, Research Vice President, 451 Research.
"The acquisition of Looker is a key move for Google Cloud that will increase the value it can provide to customers across its six key industries."

"The acquisition of Looker makes sense for Google Cloud and for their customers," said Anil Chakravarthy, CEO at Informatica. "Looker's rich analytics platform complements Google Cloud's high-scale infrastructure and digital transformation capabilities. As the leader in enterprise cloud data management, we talk to a lot of customers who want to get more value from their data, and we believe every business interested in cloud analytics should be excited about this acquisition."

"Looker provides Google Cloud the ability to provide advanced analytics, visualization, and insight generation to all parts of an organization," said Tom Galizia, Principal, Deloitte Consulting LLP. "It shows continued commitment and alignment to Google Cloud's vision of democratizing and supercharging their customers' data and information. Google organized the world's information, and now they want to do the same for the enterprise."

"Google Cloud is building an end-to-end platform for enterprise transformation, and its acquisition of Looker will help bring more complete business intelligence, analytics, and visualization capabilities to its customers," said Sanjeev Vohra, group technology officer and data business group lead at Accenture. "Looker's support for multiple public clouds and databases aligns well with Google's multi-cloud approach, and we look forward to working with our enterprise clients to implement these capabilities at scale."

I'm personally very excited about all the ways bringing Google Cloud and Looker together can help our customers. We share a common philosophy around solving business problems for customers across all industries while supporting our customers where they are, be it on Google Cloud, in other public clouds, or on premises. I look forward to sharing more once the deal closes.
Source: Google Cloud Platform