Design chief: Jony Ive leaves Apple

Apple CEO Tim Cook has lost one of his most important employees. Jonathan 'Jony' Ive is leaving the company after nearly 30 years to start his own firm. He will, however, remain connected to Apple. (Jonathan Ive, Apple)
Source: Golem

A Secure Content Workflow from Docker Hub to DTR

Docker Hub is home to the world’s largest library of container images. Millions of individual developers rely on Docker Hub for official and certified container images provided by independent software vendors (ISV) and the countless contributions shared by community developers and open source projects. Large enterprises can benefit from the curated content in Docker Hub by building on top of previous innovations, but these organizations often require greater control over what images are used and where they ultimately live (typically behind a firewall in a data center or cloud-based infrastructure). For these companies, building a secure content engine between Docker Hub and Docker Trusted Registry (DTR) provides the best of both worlds – an automated way to access and “download” fresh, approved content to a trusted registry that they control.
Ultimately, the Hub-to-DTR workflow gives developers a fresh source of validated and secure content to support a diverse set of application stacks and infrastructures, all while staying compliant with corporate standards. Here is an example of how this is executed in Docker Enterprise 3.0:

Image Mirroring
DTR allows customers to set up a mirror to grab content from a Hub repository by constantly polling it and pulling new image tags as they are pushed. This ensures that fresh images are replicated across any number of registries in multiple clusters, putting the latest content right where it’s needed while avoiding network bottlenecks.
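The polling mirror can be pictured as a loop that compares the source repository's tag list against what has already been mirrored and copies only the new tags. The sketch below is purely illustrative (the pull/push callables are stand-ins, not the DTR API):

```python
def new_tags(source_tags, mirrored_tags):
    """Return tags present upstream but not yet mirrored,
    preserving the upstream ordering."""
    mirrored = set(mirrored_tags)
    return [t for t in source_tags if t not in mirrored]

def poll_once(pull, push, source_tags, mirrored_tags):
    """One polling cycle: pull each new tag from the source repository
    and push it into the local mirror registry. Returns the tags copied."""
    copied = []
    for tag in new_tags(source_tags, mirrored_tags):
        image = pull(tag)   # conceptually: docker pull hub.example/repo:<tag>
        push(image, tag)    # conceptually: docker push dtr.example/repo:<tag>
        copied.append(tag)
    return copied
```

In the real product this logic runs inside DTR itself; the point is only that mirroring is incremental, so repeated polls move just the delta.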
Access Controls
Advanced access controls let organizations set permissions in DTR at a very granular level – down to the API. Images from Docker Hub can be mirrored into a restricted repository in DTR with access given only to qualified content administrators. The role of the content administrator is to ensure that the images meet the company’s policies.
Image Scanning
Once in the restricted repository, content administrators can set up automated vulnerability scanning, which gives organizations fine-grained visibility and control over the software and libraries that are being used. These binary-level scans compare the images and applications against the NIST CVE database to identify exposure to known security threats, giving organizations a chance to review and approve images before making them available to developers.
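Conceptually, a binary-level scan resolves the packages inside an image and looks each one up in a CVE database. A minimal sketch of that lookup step (the CVE data here is a hand-made stand-in, not the NIST feed):

```python
def find_vulnerabilities(packages, cve_db):
    """Return the CVE IDs matching any (name, version) pair found in the image.

    packages: dict mapping package name -> version discovered in the image layers
    cve_db:   dict mapping (name, version) -> list of CVE IDs
    """
    hits = []
    for name, version in packages.items():
        hits.extend(cve_db.get((name, version), []))
    return hits
```

A real scanner also matches version ranges and inspects binaries directly, but the gate it feeds is the same: a list of known CVEs per image.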
Policy-Based Image Promotion
With DTR, content administrators can set up rules-based image promotion pipelines that automate the flow of approved images to developer repositories. (E.g., “Promote image to target if vulnerability scan shows zero major vulnerabilities.”) This streamlines the development and delivery pipeline while enforcing security controls that automatically gate images, ensuring only approved content gets used by developers.
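A promotion rule like the one quoted above is just a predicate over the scan summary. A hedged sketch, assuming a scan result shaped like {"major": n, "minor": n} (the shape is illustrative, not DTR's actual schema):

```python
def should_promote(scan_summary, max_major=0):
    """Promotion gate: promote only when the count of major
    vulnerabilities does not exceed the allowed threshold."""
    return scan_summary.get("major", 0) <= max_major

def promote_if_clean(image, scan_summary, promote):
    """Apply the gate; `promote` is whatever pushes the image
    to the target (developer) repository."""
    if should_promote(scan_summary):
        promote(image)
        return True
    return False
```

Because the gate is evaluated automatically on every new scan, no human needs to sit between the restricted repository and the developer repository.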

Image Signing
Digital signatures are used to verify both the contents and publisher of images, ensuring their integrity. Customers can also take this a step further by requiring signatures from specific users before images are deployed, providing an additional layer of trust. This allows content administrators to validate that they have approved images in the developer repositories. Developers and CI tools can apply signatures as well.
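At its core, signing binds a trusted key to the image digest, so a consumer can check both integrity and publisher. The sketch below illustrates the idea with an HMAC over a digest; Docker Content Trust actually uses asymmetric keys via Notary/TUF, so treat this as a simplification of the concept, not the real mechanism:

```python
import hashlib
import hmac

def image_digest(image_bytes):
    """Content-addressed identity of the image (sha256, like registry digests)."""
    return hashlib.sha256(image_bytes).hexdigest()

def sign(digest, key):
    """Illustrative signature: an HMAC over the digest with the publisher key."""
    return hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()

def verify(digest, signature, key):
    """Reject images whose signature does not match the trusted key."""
    return hmac.compare_digest(sign(digest, key), signature)
```

Tampering with the image changes the digest, and a signature made with an untrusted key fails verification; either way the image is rejected before deployment.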
End-to-End Automation
The entire workflow outlined above can be automated within Docker Enterprise 3.0 – from image mirroring, to vulnerability scans that are triggered based on new content, to promotion policies and even the CI workflows that add digital signatures. This end-to-end automation enables enterprise developers to innovate on top of the vast content available in Docker Hub, while adhering to secure corporate standards and practices.

To learn more about Docker Enterprise 3.0:

Register for the Upcoming Docker Enterprise 3.0 Virtual Event
Try Docker Enterprise for yourself at trial.docker.com
Learn more about Docker Trusted Registry

The post A Secure Content Workflow from Docker Hub to DTR appeared first on Docker Blog.
Source: https://blog.docker.com/feed/

Leveraging complex data to build advanced search applications with Azure Search

Data is rarely simple. Not every piece of data we have fits neatly into a single Excel worksheet of rows and columns. Data has many diverse relationships, such as the multiple locations and phone numbers for a single customer or the multiple authors and genres of a single book. Relationships are typically even more complex than this, and as we start to leverage AI to understand our data, the additional learnings only add to the complexity. For that reason, expecting customers to flatten their data so it can be searched and explored is often unrealistic. We heard this often, and it quickly became the number one most requested Azure Search feature. Because of this, we are excited to announce the general availability of complex types support in Azure Search. In this post, I want to explain what complex types add to Azure Search and the kinds of things you can build using this capability.

Azure Search is a platform as a service that helps developers create their own cloud search solutions.

What is complex data?

Complex data consists of data that includes hierarchical or nested substructures that do not break down neatly into a tabular rowset. For example a book with multiple authors, where each author can have multiple attributes, can’t be represented as a single row of data unless there is a way to model the authors as a collection of objects. Complex types provide this capability, and they can be used when the data cannot be modeled in simple field structures such as strings or integers.
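In the Azure Search REST API, a complex field is declared with type Edm.ComplexType (or Collection(Edm.ComplexType) for a collection of objects) and carries its own nested fields list. A sketch of a book index with multiple authors, expressed as the Python dict you would serialize to JSON (the index and field names are illustrative):

```python
# Sketch of an index definition with a complex collection field, following the
# Azure Search REST API field syntax; "books"/"Authors" are invented names.
book_index = {
    "name": "books",
    "fields": [
        {"name": "id", "type": "Edm.String", "key": True},
        {"name": "Title", "type": "Edm.String", "searchable": True},
        {
            # Each book can have many authors, each with its own sub-fields.
            "name": "Authors",
            "type": "Collection(Edm.ComplexType)",
            "fields": [
                {"name": "Name", "type": "Edm.String", "searchable": True},
                {"name": "Country", "type": "Edm.String", "filterable": True},
            ],
        },
    ],
}
```

The nested "fields" list is what a flat rowset cannot express: each author object keeps its own searchable and filterable attributes.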

Complex types applicability

At Microsoft Build 2019, we demonstrated how complex types can be leveraged to build an effective search application. In the session we looked at the Travel Stack Exchange site, one of the many online communities supported by StackExchange.

The StackExchange data was modeled in a JSON structure to allow easy ingestion into Azure Search. If we look at the first post made to this site and focus on the first few fields, we see that all of them can be modeled using simple datatypes, including Tags, which can be modeled as a collection (array) of strings.

{
  "id": "1",
  "CreationDate": "2011-06-21T20:19:34.73",
  "Score": 8,
  "ViewCount": 462,
  "BodyHTML": "<p>My fiancée and I are looking for a good Caribbean cruise in October and were wondering which…",
  "Body": "my fiancée and i are looking for a good caribbean cruise in october and were wondering which islands…",
  "OwnerUserId": 9,
  "LastEditorUserId": 101,
  "LastEditDate": "2011-12-28T21:36:43.91",
  "LastActivityDate": "2012-05-24T14:52:14.76",
  "Title": "What are some Caribbean cruises for October?",
  "Tags": [
    "caribbean",
    "cruising",
    "vacations"
  ],
  "AnswerCount": 4,
  "CommentCount": 4,
  "CloseDate": "0001-01-01T00:00:00",

However, as we look further down this dataset, we see that the data quickly gets more complex and cannot be mapped into a flat structure. For example, there can be numerous comments and answers associated with a single document. Even Votes is defined here as a complex type (technically it could have been flattened, but that would add work to transform the data).

"CloseDate": "0001-01-01T00:00:00",
"Comments": [
  {
    "Score": 0,
    "Text": "To help with the cruise line question: Where are you located? My wife and I live in New Orlea…",
    "CreationDate": "2011-06-21T20:25:14.257",
    "UserId": 12
  },
  {
    "Score": 0,
    "Text": "Toronto, Ontario. We can fly out of anywhere though.",
    "CreationDate": "2011-06-21T20:27:35.3",
    "UserId": 9
  },
  {
    "Score": 3,
    "Text": "\"Best\" for what? Please read [this page](http://travel.stackexchange.com/questions/how-to…",
    "UserId": 20
  },
  {
    "Score": 2,
    "Text": "What do you want out of a cruise? To relax on a boat? To visit islands? Culture? Adventure?",
    "CreationDate": "2011-06-24T05:07:16.643",
    "UserId": 65
  }
],
"Votes": {
  "UpVotes": 10,
  "DownVotes": 2
},
"Answers": [
  {
    "IsAcceptedAnswer": "True",
    "Body": "This is less than an answer, but more than a comment…\n\nA large percentage of your travel b…",
    "Score": 7,
    "CreationDate": "2011-06-24T05:12:01.133",
    "OwnerUserId": 74

All of this data is important to the search experience. For example, you might want to:

Search for and highlight phrases not only in the original question, but also in any of the comments.
Limit documents to those where an answer was provided by a specific user.
Boost certain documents higher in the search results when they have a higher number of up votes.
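The scenarios above can be sketched in miniature over the sample document, with plain Python standing in for the service's query engine (in the real service these would be expressed as OData filters over the complex collections, e.g. something like Answers/any(a: a/OwnerUserId eq 74), plus a scoring profile for the boost):

```python
def answered_by(docs, user_id):
    """Documents where any answer was provided by the given user."""
    return [d for d in docs
            if any(a.get("OwnerUserId") == user_id for a in d.get("Answers", []))]

def rank_by_upvotes(docs):
    """Boost documents with more up votes toward the top of the results."""
    return sorted(docs,
                  key=lambda d: d.get("Votes", {}).get("UpVotes", 0),
                  reverse=True)
```

Both operations reach inside nested objects (Answers, Votes), which is exactly what a flattened, tabular index cannot represent.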

In fact, we could even improve on the existing StackExchange search interface by leveraging Cognitive Search to extract key phrases from the answers to supply potential phrases for autocomplete as the user types in the search box.

All of this is now possible because not only can you map this data to a complex structure, but the search queries can support this enhanced structure to help build out a better search experience.

Next Steps

If you would like to learn more about Azure Search complex types, please visit the documentation, or check out the video and associated code I made, which dig into this Travel StackExchange data in more detail.
Source: Azure

Azure Blockchain Workbench 1.7.0 integration with Azure Blockchain Service

We’re excited to share the release of Microsoft Azure Blockchain Workbench 1.7.0, which, along with our new Azure Blockchain Service, can further enhance your blockchain development and projects. You can deploy a new instance of Blockchain Workbench through the Azure portal or upgrade your existing deployments to 1.7.0 using the upgrade script.

This update includes the following improvements:

Integration with Azure Blockchain Service

With the Azure Blockchain Service now in preview, you can develop directly with Blockchain Workbench on Azure Blockchain Service as the underlying blockchain. For those of you who have been on this blockchain journey with Microsoft, there are now templates in Azure which make it faster to configure and deploy a private blockchain network, but it’s still up to you to maintain and run your blockchain nodes, including upgrading to new versions, installing security patches, and more. Azure Blockchain Service simplifies the maintenance of the underlying blockchain network by running a fully managed blockchain node for you.

Blockchain Workbench helps with building the scaffolding needed on top of a blockchain network to quickly iterate and develop blockchain solutions. Workbench 1.7.0 enables you to easily deploy the Azure Blockchain Service directly with Workbench. To deploy Workbench from the Azure Marketplace, navigate to the Advanced settings blade and select Create new blockchain network under Blockchain settings.

Selecting this option automatically deploys an Azure Blockchain Service node for you. Note that if you rotate the primary API key of the primary transaction node on your Azure Blockchain Service, you need to update the key of the configured RPC endpoint in Blockchain Workbench: update the Key Vault with the new key and reboot the VMs.

Enhanced compatibility with Quorum

One of the features customers have requested most is compatibility with additional blockchain network protocols. In previous releases of Blockchain Workbench, the default configured blockchain network was an Ethereum Proof-of-Authority (PoA) network. With Blockchain Workbench 1.7.0, we have added compatibility with the Quorum blockchain network.

For customers who are looking to build blockchain applications on top of Quorum, you can now develop and build your Quorum based applications directly with Blockchain Workbench.

You can stay up to date on Azure Blockchain Service by following the team on Twitter @MSFTBlockchain. Please use the Blockchain UserVoice to provide feedback and suggest features and ideas. Your input is helping make this a great service. We look forward to hearing from you.
Source: Azure

A solution to manage policy administration from end to end

Legacy systems can be a nightmare for any business to maintain. In the insurance industry, carriers struggle not only to maintain these systems but to modify and extend them to support new business initiatives. The insurance business is complex: every state and nation has its own unique set of rules, regulations, and demographics. Creating a new product such as an automobile policy has traditionally required the coordination of many different processes, systems, and people. The monolithic systems traditionally used to create new products are inflexible, and creating a new product can be an expensive proposition.

The Azure platform offers a wealth of services for partners to enhance, extend, and build industry solutions. Here we describe how one Microsoft partner, Sunlight Solutions, uses Azure to solve a unique problem.

Monolithic systems and their problems

Insurers have long been restricted by complex digital ecosystems created by single-service solutions. Those tasked with maintaining such legacy, monolithic systems struggle as the systems age and become more unwieldy. Upgrades and enhancements often require significant new development, large teams, and long-term planning, which is expensive, unrealistic, and a drain on morale. Worse, these systems restrict businesses from pursuing new and exciting opportunities.

A flexible but dedicated solution

An alternative is a single solution provider that is well versed in the insurance business but able to create a dedicated and flexible solution, one that overcomes the problems of a monolith. Sunlight is such a provider. It gives insurance carriers the benefits of receiving end-to-end insurance administration functionality from a single vendor. At the same time, its solution provides greater flexibility, faster speed-to-market, and fewer relationships to manage, with lower integration costs.

Sunlight’s solution is a single system which manages end-to-end functionality across policy, billing, claims, forms management, customer/producer CRM, reporting and much more. According to Sunlight:

“We are highly flexible, managed through configuration rather than development. This allows for rapid speed to market for the initial deployment and complete flexibility when you need to make changes or support new business initiatives. Our efficient host and continuous delivery models address many of the industry’s largest challenges with respect to managing the cost and time associated with implementation, upgrades, and product maintenance.”

In order to achieve their goals of being quick but pliable, the architecture of the solution is a mixture of static and dynamic components. Static components are fields that do not change; dynamic components, such as lists, populate at run time. As the graphic below conveys, the solution uses static elements but lets users configure dynamic parts as needed. The result is a faster cycle that maintains familiarity while allowing a variety of data types.

In the figure above, data appears depending on the product. When products are acquired, for example through mergers, the static data can be mapped. If a tab exists for the product, it appears. For example, “benefits” and “deductibles” are not a part of every product.
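The static/dynamic split described above can be pictured as a fixed field schema merged with per-product components resolved at run time. A purely illustrative sketch (the field names are invented, not Sunlight's actual data model):

```python
# Static components: fields every insurance product shares, regardless of line.
STATIC_FIELDS = {"policy_number": None, "effective_date": None, "insured_name": None}

def build_product_form(dynamic_components):
    """Merge the static schema with dynamic, per-product components.

    A tab such as 'deductibles' appears only when the product defines it,
    which is how acquired products can be mapped without code changes.
    """
    form = dict(STATIC_FIELDS)
    form.update(dynamic_components)
    return form
```

A product without a "benefits" or "deductibles" component simply omits that key, and the corresponding tab never appears.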

Benefits

In brief, here are the key gains made by using Sunlight:

End-to-end functionality: Supports all products/coverages/lines of business
Cloud-based and accessible anywhere
Supports multiple languages and currencies
Globally configurable for international taxes and regional regulatory controls
Highly configurable by non-IT personnel
Reasonable price-point

Azure services

Azure Virtual Machines are used to implement the entire project life cycle quickly.
Azure Security Center provides a complete and dynamic infrastructure that continuously improves on its own.
Azure Site Recovery plans are simple to implement for our production layer.
Azure Functions is used to quickly replicate environments.
Azure Storage is used to keep the application light with a range of storage options for increased access time based on the storage type.

Next steps

To learn more about other industry solutions, go to the Azure for insurance page. To find more details about this solution, go to Sunlight Enterprise on the Azure Marketplace and select Contact me.
Source: Azure

New PCI DSS Azure Blueprint makes compliance simpler

I’m excited to announce our second Azure Blueprint for an important compliance standard with the release of the PCI-DSS v3.2.1 blueprint. The new blueprint maps a core set of policies for Payment Card Industry (PCI) Data Security Standard (DSS) compliance to any Azure deployed architecture, allowing businesses such as retailers to quickly create new environments with compliance built into the Azure infrastructure.

Azure Blueprints is a free service that enables customers to define a repeatable set of Azure resources that implement and adhere to standards, patterns, and requirements. The service allows customers to set up governed Azure environments that can scale to support production implementations for large-scale migrations.

Azure Blueprints is another reason why Azure is a strong platform for compliance, with the industry’s broadest and deepest portfolio of 91 compliance offerings. Azure is built using some of the most rigorous security and compliance standards in the world, and includes multi-layered security provided by Microsoft across physical datacenters, infrastructure, and operations. Azure is also built for the specific compliance needs of key industries, including over 50 compliance offerings specifically for the retail, health, government, finance, education, manufacturing, and media industries.

Compliance with regulations and standards such as ISO 27001, FedRAMP and SOC is increasingly necessary for all types of organizations, making control mappings to compliance standards a natural application for Azure Blueprints. Azure customers, particularly those in regulated industries, have expressed strong interest in compliance blueprints to help ease their compliance burdens. In March, we announced the ISO 27001 Shared Services blueprint sample which maps a set of foundational Azure infrastructure, such as virtual networks and policies, to specific ISO controls.

The PCI DSS is a global information security standard designed to prevent fraud through increased control of credit card data. Organizations that accept payments from credit cards must follow PCI DSS standards if they accept payment cards from the five major credit card brands. Compliance with PCI DSS is also required for any organization that stores, processes, or transmits payment and cardholder data.

The PCI-DSS v3.2.1 blueprint includes mappings to important PCI DSS controls, including:

Segregation of duties. Manage subscription owner permissions.
Access to networks and network services. Implement role-based access control (RBAC) to manage who has access to Azure resources.
Management of secret authentication information of users. Audit accounts that don't have multi-factor authentication enabled.
Review of user access rights. Audit accounts that should be prioritized for review, including deprecated accounts and external accounts with elevated permissions.
Removal or adjustment of access rights. Audit deprecated accounts with owner permissions on a subscription.
Secure log-on procedures. Audit accounts that don't have multi-factor authentication enabled.
Password management system. Enforce strong passwords.
Policy on the use of cryptographic controls. Enforce specific cryptographic controls and audit use of weak cryptographic settings.
Event and operator logging. Diagnostic logs provide insight into operations that were performed within Azure resources.
Administrator and operator logs. Ensure system events are logged.
Management of technical vulnerabilities. Monitor missing system updates, operating system vulnerabilities, SQL vulnerabilities, and virtual machine vulnerabilities in Azure Security Center.
Network controls. Manage and control networks and monitor network security groups with permissive rules.
Information transfer policies and procedures. Ensure information transfer with Azure services is secure.

We are committed to helping our customers leverage Azure in a secure and compliant manner. Over the next few months we will release new built-in blueprints for HITRUST, UK National Health Service (NHS) Information Governance (IG) Toolkit, FedRAMP, and Center for Internet Security (CIS) Benchmark. If you would like to participate in any early previews please sign up with this form, or if you have a suggestion for a compliance blueprint, please share it via the Azure Governance Feedback Forum.

Learn more about the Azure PCI-DSS v3.2.1 blueprint in our documentation.
Source: Azure