Announcing Azure Stream Analytics on edge devices (preview)

Today, we are announcing Azure Stream Analytics (ASA) on edge devices, a new feature of Azure Stream Analytics that enables customers to deploy analytical intelligence closer to their IoT devices and unlock the full value of device-generated data.

Azure Stream Analytics on edge devices extends all the benefits of our unique streaming technology from the cloud down to devices. With ASA on edge devices, we are offering the power of our Complex Event Processing (CEP) solution on edge devices, making it easy to develop and run real-time analytics on multiple streams of data. One of the key benefits of this feature is its seamless integration with the cloud: users can develop, test, and deploy their analytics from the cloud, using the same SQL-like language for both cloud and edge analytics jobs. As in the cloud, this SQL language notably enables temporal joins, windowed aggregates, temporal filters, and other common operations such as aggregates, projections, and filters. Users can also seamlessly integrate custom code in JavaScript for advanced scenarios.
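To give a rough sense of what a windowed aggregate computes, here is a minimal Python sketch of a tumbling-window average, the kind of result an ASA query grouping by a tumbling window produces. The event schema (timestamp, device id, value) and the 10-second window size are illustrative assumptions, not part of the product:

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds=10):
    """Group (timestamp, device_id, value) events into fixed,
    non-overlapping windows and return the average value per
    device per window."""
    buckets = defaultdict(list)
    for ts, device_id, value in events:
        # Each event belongs to exactly one window, keyed by its start time.
        window_start = (ts // window_seconds) * window_seconds
        buckets[(window_start, device_id)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

events = [
    (0, "sensor-1", 20.0),
    (4, "sensor-1", 22.0),
    (11, "sensor-1", 30.0),
]
print(tumbling_window_avg(events))
# {(0, 'sensor-1'): 21.0, (10, 'sensor-1'): 30.0}
```

In an actual ASA job this logic is expressed declaratively in the SQL-like query language rather than in imperative code; the sketch only shows the windowing semantics.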

Enabling new scenarios

Azure IoT Hub, a core Azure service that connects, monitors and updates IoT devices, has enabled customers to connect millions of devices to the cloud, and Azure Stream Analytics has enabled customers to easily deploy and scale analytical intelligence in the cloud for extracting actionable insights from device-generated data. However, multiple IoT scenarios require real-time response, resiliency to intermittent connectivity, handling of large volumes of raw data, or pre-processing of data to ensure regulatory compliance. All of these can now be achieved by using ASA on edge devices to deploy and operate analytical intelligence physically closer to the devices.

Hewlett Packard Enterprise (HPE) is an early preview partner who has demonstrated a working prototype of ASA on edge devices at Microsoft's booth at Hannover Messe (April 24 to 28, Hall 7, Stand C40). A result of close collaboration between Microsoft, HPE and the OPC Foundation, the prototype is based on Azure Stream Analytics, the HPE Edgeline EL1000 Converged Edge System, and the OPC Unified Architecture (OPC-UA), delivering real-time analysis, condition monitoring, and control. The HPE Edgeline EL1000 Converged Edge System integrates compute, storage, data capture, control, and enterprise-class systems and device management, and is built to thrive in hardened environments and handle shock, vibration, and extreme temperatures.

ASA on edge devices is particularly interesting for Industrial IoT (IIoT) scenarios that require reacting to operational data with ultra-low latency. Systems such as manufacturing production lines or remote mining equipment need to analyze and act in real-time to the streams of incoming data, e.g. when anomalies are detected.

In offshore drilling, offshore windfarms, or ship transport scenarios, analytics need to run even when internet connectivity is intermittent. In these cases, ASA on edge devices can run reliably to summarize and monitor events, react to events locally, and leverage connection to the cloud when it becomes available.

In industrial IoT scenarios, the volume of data can be too large to be sent to the cloud directly due to limited bandwidth or bandwidth cost. For example, the data produced by jet engines (a typical number is that 1TB of data is collected during a flight) or manufacturing sensors (each sensor can produce 1MB/s to 10MB/s) may need to be filtered down, aggregated or processed directly on the device before sending it to the cloud. Examples of these processes include sending only events when values change instead of sending every event, averaging data on a time window, or using a user-defined function.
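The "send only events when values change" pattern mentioned above can be sketched in a few lines of Python. This is an illustrative edge-side filter, not ASA code; the tolerance parameter and reading format are assumptions for the example:

```python
def changed_values_only(readings, tolerance=0.0):
    """Yield only readings whose value differs from the last forwarded
    value by more than `tolerance`, cutting the volume of data sent
    upstream while preserving every meaningful change."""
    last_sent = None
    for reading in readings:
        if last_sent is None or abs(reading - last_sent) > tolerance:
            yield reading
            last_sent = reading

readings = [5.0, 5.0, 5.0, 6.0, 6.0, 5.5]
print(list(changed_values_only(readings, tolerance=0.2)))
# [5.0, 6.0, 5.5]
```

Against a sensor emitting megabytes per second of mostly steady readings, even a simple dead-band filter like this can reduce upstream traffic by orders of magnitude before the remaining events are aggregated or sent to the cloud.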

Until now, customers with such requirements had to build custom solutions, and manage them separately from their cloud applications. Now, customers can use Azure Stream Analytics to seamlessly develop and operate their stream analytics jobs both on edge devices and in the cloud.

How to use Azure Stream Analytics on edge devices

Azure Stream Analytics on edge devices leverages the Azure IoT Gateway SDK to run on Windows and Linux operating systems, and supports a wide range of hardware, from single-board computers to full PCs, servers, and dedicated field gateway devices. The IoT Gateway SDK provides connectors for industry-standard communication protocols such as OPC-UA, Modbus, and MQTT, and can be extended to support your own communication needs. Azure IoT Hub is used to provide secure bi-directional communication between gateways and the cloud.

Azure Stream Analytics on edge devices is available now in private preview. To request access to the private preview, click here.

You can also meet with our team at Hannover Messe, the world's biggest industrial fair, which takes place from April 24th to April 28th in Hannover, Germany. We are located at the Microsoft booth in the Advanced Analytics pod (Hall 7, Stand C40).
Source: Azure

Announcing new functionality to automatically provision devices to Azure IoT Hub

We’re announcing a new service for Azure IoT Hub that allows customers to provision millions of devices in a secure and scalable manner. Azure IoT Hub Device Provisioning enables zero-touch provisioning of devices to the right IoT hub without requiring human intervention, and is currently being used by early adopters to validate various solution deployment scenarios.

Provisioning is an important part of the lifecycle management of an IoT device, which enables seamless integration with an Azure IoT solution. Technically speaking, provisioning pairs devices with an IoT hub based on any number of characteristics such as:

Location of the device (geo-sharding)
Customer who bought the device (multitenancy)
Application in which the device is to be used (solution isolation)
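As a conceptual illustration of the first characteristic in the list above (geo-sharding), the toy Python sketch below routes a device to a regional hub based on its reported location. The hub hostnames and device record shape are invented for the example, and real provisioning evaluates enrollment rules inside the service rather than on the device:

```python
def pick_hub(device, hubs):
    """Toy allocation policy: route a device to the regional hub
    matching its reported location, falling back to a default hub."""
    return hubs.get(device.get("location"), hubs["default"])

# Hypothetical hub endpoints, one per region plus a catch-all.
hubs = {
    "eu": "myhub-westeurope.azure-devices.net",
    "us": "myhub-eastus.azure-devices.net",
    "default": "myhub-global.azure-devices.net",
}

print(pick_hub({"id": "dev-42", "location": "eu"}, hubs))
# myhub-westeurope.azure-devices.net
```

The same lookup shape covers the other two characteristics: key multitenancy on the purchasing customer, or solution isolation on the target application, instead of on location.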

The Azure IoT Hub Device Provisioning service is made even better thanks to some security standardization work called DICE and will support multiple types of hardware security modules such as TPM. In conjunction with this, we announced hardware partnerships with STMicro and Micron.

Without IoT Hub Device Provisioning, setting up and deploying a large number of devices to work with a cloud backend is hard and involves a lot of manual work. This is true today for Azure IoT Hub: while customers can create many device identities in the hub at a time using bulk import, they still must individually place connection credentials on the devices themselves. It's tedious, and today customers must build their own solution functionality to avoid this painful manual process. Our commitment to strong security best practices is partly to blame. IoT Hub requires each device to have a unique identity registered to the hub in order to enable per-device access revocation in case a device is compromised. This is a security best practice, but like many security-related best practices, it tends to slow down deployment.
 
Not only that, but registering a device to Azure IoT Hub is really only half the battle. Once a device is registered, physically deployed in the field, and hooked up to the device management dashboard, now customers have to configure the device with the proper desired twin state and firmware version. This extra step is more time that the device is not a fully-functioning member of the IoT solution. We can do better using the IoT Hub Device Provisioning service.

Hardcoding endpoints with credentials in mass production is operationally expensive, and on top of that the device manufacturer might not know how the device will be used or who the eventual device owner will be, or they may not care. In addition, complete provisioning may involve information that was not available when the device was manufactured, such as who purchased the device. The Azure IoT Hub Device Provisioning service contains all the information needed to provision a device.

Devices running Windows 10 IoT Core operating systems will enable an even easier way to connect to Device Provisioning via an in-box client that OEMs can include in the device unit. With Windows 10 IoT Core, customers can get a zero-touch provisioning experience, eliminating any configuration and provisioning hassles when onboarding new IoT devices that connect to Azure services. When combined with Windows 10 IoT Core support for Azure IoT Hub device management, the entire device life cycle management is simplified through features that enable device reprovisioning, ownership transfer, secure device management, and device end-of-life management. You can learn more about Windows IoT Core device provisioning and device management details by visiting Azure IoT Device Management.

Azure IoT is committed to offering our customers services which take the pain out of deploying and managing an IoT solution in a secure, reliable way. The Azure IoT Hub Device Provisioning service is currently in private preview, and we'll make further announcements when it becomes available to the public. In the meantime, you can learn more about Azure IoT Hub's device management capabilities. We would love to get your feedback on secure device registration, so please continue to submit your suggestions through the Azure IoT User Voice forum or join the Azure IoT Advisors Yammer group.

Learn more about Microsoft IoT

Microsoft is simplifying IoT so every business can digitally transform through IoT solutions that are more accessible and easier to implement. Microsoft has the most comprehensive IoT portfolio, with a wide range of offerings to meet organizations wherever they are on their IoT journey: operating systems for their devices, cloud services to control them, advanced analytics to gain insights, and business applications to enable intelligent action. See how Microsoft IoT can transform your business.
Source: Azure

Are You Ready To Trust Facebook With Your Brain?

This is not Facebook's brain-computer interface

Yoshikazu Tsuno / AFP / Getty Images

In a 2015 Q&A, Facebook CEO Mark Zuckerberg argued that the future of communication may well be telepathy. “One day, I believe we'll be able to send full rich thoughts to each other directly using technology,” he said. “You'll just be able to think of something and your friends will immediately be able to experience it too.”

Now, two years later, Facebook is working hard to make Zuckerberg's futurist vision a reality. And Regina Dugan — who headed up both the Defense Advanced Research Projects Agency (DARPA) and Google's Advanced Technology and Projects group — is leading the effort. Onstage at Facebook's F8 conference Wednesday, Dugan provided concrete details on the company’s telepathy efforts for the first time, introducing a research initiative she hopes will someday enable us to type words into Facebook posts simply by thinking them.

“We have a goal of creating a system capable of typing 100 words per minute … straight from your brain.”

“It sounds impossible, but it’s closer than you may realize,” Dugan said. “We have a goal of creating a system capable of typing 100 words per minute — five times faster than you can type on your smartphone — straight from your brain.”

For Dugan, the idea of such a brain-computer interface is not nearly as far-fetched as it might sound. More to the point, it makes good sense. Thinking our words into a computer would likely be more efficient than manually typing them. And thinking commands into our smartphones could free us from staring at them so much, giving us more time to engage with the world around us. It could also make glasses that overlay digital information on the real world feel natural, since we wouldn’t have to operate them clunkily via touch or voice. “Even something as simple as a 'yes,' 'no' brain click would fundamentally change our capability,” Dugan said.

It remains to be seen if Facebook can develop such a technology and widely deploy it. But should it manage to do so it’ll likely encounter a perhaps more monumental task: convincing people to trust a company with a bumpy history of privacy missteps with their brains.

Well aware of such concerns, Dugan stressed that Facebook is taking a measured approach to these new brain-computer interfaces. “To be clear, we are not talking about decoding your random thoughts,” she said. “We’re talking about decoding those words, the ones you already decided to share, by sending them to the speech center of your brain.”

“To be clear, we are not talking about decoding your random thoughts.”

Dugan’s emphasis on thought-typing’s tie to voluntary human decisions could be the foundation of Facebook’s “trust us” pitch for such technology. You typically don’t move your arm unless you want to do so; you typically don’t speak without intending to say something; presumably you won’t think things into Facebook unless you choose to do so.

There will be hardware involved, of course. Dugan told BuzzFeed News that thought-typing will likely be done via an electronic headband or augmented reality glasses — or perhaps something the team hasn’t thought of yet. Asked if she thought people might be hesitant to wear such a Facebook-developed device, Dugan parried the question. “I want to be careful not to dial forward to a whole slew of potential hypotheses of what might and might not occur,” she said.

But Dugan did stress that Facebook isn’t blindly pursuing technological advancement without considering the implications. “We ask questions about technological progress and we always will,” she said. ”I feel optimistic about technological progress; I also feel responsible for doing the right things.”

Source: BuzzFeed

Introducing H2O.ai on Azure HDInsight

We are excited to announce that H2O's AI platform is now available on the Azure HDInsight Application Platform. Customers can now run H2O.ai's open source solutions on Azure HDInsight, which offers reliable open source analytics backed by an industry-leading SLA.

To learn more about H2O integration with HDInsight, register for the webinar held by H2O and Azure HDInsight team.

HDInsight and H2O to make data science on big data easier

Azure HDInsight is the only fully-managed cloud Hadoop offering that provides optimized open source analytical clusters for Spark, Hive, MapReduce, HBase, Storm, Kafka, and R Server backed by a 99.9% SLA. Each of these big data technologies and ISV applications, such as H2O, are easily deployable as managed clusters with enterprise-level security and monitoring.

The data science ecosystem has grown rapidly in the last few years, and H2O's AI platform provides an open source machine learning framework that works with Spark, sparklyr, and PySpark. H2O's Sparkling Water allows users to combine the fast, scalable machine learning algorithms of H2O with the capabilities of Spark. With Sparkling Water, users can drive computation from Scala/R/Python and utilize the H2O Flow UI, providing an ideal machine learning platform for application developers.

Setting up an environment to perform advanced analytics on top of big data is hard, but with H2O Artificial Intelligence for HDInsight, customers can get started with just a few clicks. This solution installs Sparkling Water on an HDInsight Spark cluster so you can exploit the benefits of both Spark and H2O. The solution can access data from Azure Blob storage and/or Azure Data Lake Store, in addition to all the standard data sources that H2O supports. It also provides Jupyter Notebooks with built-in examples for an easy jumpstart, and a user-friendly H2O Flow UI to monitor and debug applications.

Getting started

With the industry-leading Azure cloud platform, getting started with H2O on HDInsight takes just a few clicks. Customers can install H2O when creating a new HDInsight cluster by selecting the custom applications option, choosing “H2O Artificial Intelligence for HDInsight”, and agreeing to the license terms.

Customers can also deploy H2O on an existing HDInsight Spark cluster by clicking the “Application” link.

Sparkling Water integrates H2O's fast, scalable machine learning engine with Spark. It provides utilities to publish Spark data structures (RDDs, DataFrames) as H2O frames and vice versa, along with a Python interface that enables use of Sparkling Water directly from PySpark, among other utilities. The architecture for H2O on HDInsight is shown below:

After installing H2O on HDInsight, you can use Jupyter Notebooks, which are built into Spark clusters, to write your first H2O on HDInsight application. Simply open a Jupyter Notebook and you will see a folder named “H2O-PySparkling-Examples”, which contains a few getting-started examples.

H2O Flow is an interactive web-based computational user interface where you can combine code execution, text, mathematics, plots, and rich media into a single document. It provides a richer visualization experience for machine learning models, with native support for hyper-parameter tuning, ROC curves, and more.

Together with this combined offering of H2O on HDInsight, customers can easily build data science solutions and run them at enterprise grade and scale. Azure HDInsight provides the tools for a user to create a Data Science environment with underlying big data frameworks like Hadoop and Spark, while H2O’s technology brings a set of sophisticated, fully distributed algorithms to rapidly build and deploy highly accurate models at scale.

H2O.ai is now available on the Microsoft Azure Marketplace and as an HDInsight application. For more technical details, please refer to the H2O documentation and this technical blog post on the HDInsight blog.

Resources

H2O press release
Learn more about Azure HDInsight
Learn more about H2O
H2O on Azure marketplace
Getting Started with H2O on HDInsight
Use H2O with Azure HDInsight

Summary

We are pleased to announce the expansion of HDInsight Application Platform to include H2O.ai. By deploying H2O on HDInsight, customers can easily build analytical solutions and run them at enterprise grade and scale.
Source: Azure

Alex Jones And The Dark New Media Are On Trial In Texas

Infowars / Getty

AUSTIN, Texas — Halfway through the second official day of his 10-day civil custody trial, Alex Jones reclined in his chair and mopped sweat from his brow while watching a shirtless, pantsless version of himself hawk male vitality supplements on a courtroom television screen. It was hardly the most outlandish moment of the afternoon.

Indeed, the first full day of Jones’ battle to retain custody of his three young children was filled with bizarre allegations — claims that Jones took his shirt off during a joint family counseling session and once blamed his inability to recall basic facts about his children during a pre-trial deposition on having “had a big bowl of chili for lunch.”

The news from the Travis County courtroom — breathless tweets from a gaggle of journalists covering the trial — bled across the internet instantly. Since Sunday evening, when the Austin American-Statesman broke the news that Jones’ attorneys planned to defend his custody on the grounds that his two-plus decades of conspiracy theorizing has been “performance art,” Alex Jones' name and reputation have unexpectedly become one of the biggest stories in the country.

And while it’s unusual for a contentious family custody case to end up as fodder for late-night television hosts (the Jones case got the extended Colbert monologue treatment on Monday evening), Jones’ trial is far larger than his painful and in some ways ordinary family dispute. For the millions who either adore or revile Jones, the case offers the hope of answering a near-impossible question: where does Alex Jones the character end and Alex Jones the person begin?

But the herculean task of untangling Jones from his political views has put the 43-year-old broadcaster at the center of something bigger than himself. Unexpectedly, Jones is now the star of a courtroom drama that feels less like a quotidian family law case and more like a referendum on politics, the internet, and the media in the post-Trump ecosystem.

And that’s because at present Jones and his Infowars media empire sit at the intersection of the thorniest issues across the media landscape. Jones, depending on who you ask, is either a participant in, defender of, or the driving force behind everything from fake news, online harassment and conspiracy theories to the toxic, hyper-partisan politicization of seemingly innocuous events.

Which is what makes Jones’ trial — and his impending trip to the witness stand — so alluring. Perhaps less interesting than knowing exactly what Jones truly believes is the prospect of watching legal experts compel earnest testimony from one of the nation’s top exporters of loose facts, untruths, and partisanship. Jones’ unenviable position then — disavow your lucrative professional views or risk losing your family — feels like a rare shot at the truth at a time when disinformation and professionalized trolling are staples of both sides of the political spectrum.

And while Jones’ verdict will likely set few precedents when it comes to internet conspiracy theorizing, the national scrutiny is bound to imbue even the smallest rulings with added meaning. Jones’ performance-art defense resonates deeply during a month in which CNN’s president, Jeff Zucker, prompted outrage by comparing political news coverage to a sporting event. Similarly, Judge Orlinda Naranjo’s decisions to admit or disallow Jones’ rants into evidence seem — perhaps unfairly — like referendums on fake news. Mix in the trial’s custody element and the case’s rulings grow unanswerable and near-existential to an outside observer: can someone who traffics in fake news simultaneously be a good father? Just how amoral are conspiratorial thoughts when they’re published for a wider audience?

And because Jones and the Infowars empire are creatures of the internet, the trial stands to put the engines and platforms of information distribution on trial in unexpected ways. On Tuesday there was confusion in the court as to whether Jones’ impromptu Facebook Live streaming videos — which depict him shouting at protesters and slurring words ahead of Donald Trump’s inauguration — should be classified as part of Jones’ professional life or if they were videos of a more personal nature, given that they weren’t shot in studio. There’s even a debate to be had over the legal sincerity of certain online threats. Jones’ parenting, for example, was called into question Tuesday by the opposing counsel for bringing his 14-year-old son onto his streamed radio show after having received death threats online during his broadcasts.

And if you’re looking to understand how alternative political and factual universes respond to news about polarizing figures, the first two days of Jones’ trial have been highly instructive. Jones’ attorney’s performance-art defense was received by the mainstream media as an ideological checkmate of sorts, while his defenders reflexively blamed the claims on a deeper, more sinister conspiracy. Jones himself denied the rumors and claimed that the media was doing to him what they claim he does: take a kernel of truth and spin it to fit a convenient fake news narrative. Jones’ critics react incredulously while his fans argue it’s unfair to politicize what should be a private family matter. All sides talk past each other, ignoring the other and assuming they've won.

This local custody trial is not supposed to be about Alex Jones. And yet his centrality to the proceedings is unavoidable. Still, despite the fact that this is at its core a family matter and an examination of Jones’ dueling personae, it is hard to shake the feeling that there’s something greater looming over the 10 days. This is the 21st-century media’s Scopes Monkey Trial (we are the lower primates here, not the earnest schoolteacher), and the trial’s symbolic meaning will overshadow its subjects, litigants, and even its verdict. Instead, it stands in as a referendum on a divisive moment, to be interpreted differently by all who follow along. Trials with media personalities highlight this further. Last year’s Hulk Hogan lawsuit against Gawker Media was largely viewed as a condemnation of a bygone era of sensationalist online writing and reporting. And in its own overdetermined way, Jones’ trial, coverage, and fallout feels a bit like a trial for the media — with all its attendant volatility and uncertainty and toxicity — in the year 2017.

We are riveted to our olive green, cushioned seats in the gallery of the Travis County Courthouse this week because of Jones’ profound influence — both intentional and unintended — on our politics, culture, and on a conspiratorial ideology of fear that transcends party lines. The case of Jones v. Jones resonates so deeply at this moment because we are living in a moment that Alex Jones himself ushered in.

Source: BuzzFeed

Announcing Azure Analysis Services general availability

Today at the Data Amp event, we are announcing the general availability of Microsoft Azure Analysis Services, the latest addition to our data platform in the cloud. Based on the proven analytics engine in SQL Server Analysis Services, Azure Analysis Services is an enterprise-grade OLAP engine and BI modeling platform, offered as a fully managed platform-as-a-service (PaaS). Azure Analysis Services enables developers and BI professionals to create BI Semantic Models that can power highly interactive and rich analytical experiences in BI tools and custom applications.

Why Azure Analysis Services?

The success of any modern data-driven organization requires that information is available at the fingertips of every business user, not just IT professionals and data scientists, to guide their day-to-day decisions. Self-service BI tools have made huge strides in making data accessible to business users. However, most business users don’t have the expertise or desire to do the heavy lifting that is typically required – finding the right sources of data, importing the raw data, transforming it into the right shape, and adding business logic and metrics – before they can explore the data to derive insights. With Azure Analysis Services, a BI professional can create a semantic model over the raw data and share it with business users so that all they need to do is connect to the model from any BI tool and immediately explore the data and gain insights. Azure Analysis Services uses a highly optimized in-memory engine to provide responses to user queries at the speed of thought.
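The division of labor described above, a BI professional defining business logic once so that users can query it by name, can be thought of as named measures defined over raw data. The toy Python sketch below illustrates the idea; the measure names, columns, and figures are invented for the example and have nothing to do with the Analysis Services API:

```python
# Raw data, as it might arrive from a source system (invented sample).
raw_sales = [
    {"region": "EMEA", "amount": 120.0, "cost": 80.0},
    {"region": "EMEA", "amount": 200.0, "cost": 140.0},
    {"region": "APAC", "amount": 150.0, "cost": 90.0},
]

# The BI professional encodes business logic once, as named measures...
measures = {
    "Total Sales": lambda rows: sum(r["amount"] for r in rows),
    "Gross Margin": lambda rows: sum(r["amount"] - r["cost"] for r in rows),
}

def evaluate(measure, region=None):
    """...and business users ask for a measure by name, optionally
    filtered, without touching the raw data or its transformations."""
    rows = [r for r in raw_sales if region is None or r["region"] == region]
    return measures[measure](rows)

print(evaluate("Total Sales"))           # 470.0
print(evaluate("Gross Margin", "EMEA"))  # 100.0
```

In Azure Analysis Services the model plays the role of the `measures` dictionary at enterprise scale, and BI tools issue the equivalent of the `evaluate` call against the in-memory engine.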

Integrated with the Azure data platform

Azure Analysis Services is the latest addition to the Azure data platform. It integrates with many Azure data services enabling customers to build sophisticated analytics solutions.

Azure Analysis Services can consume data from Azure SQL Database and Azure SQL Data Warehouse. Customers can build enterprise data warehouse solutions in Azure using a hub-and-spoke model, with the SQL data warehouse at the center and multiple BI models around it targeting different business groups or subject areas.
With more and more customers adopting Azure Data Lake and HDInsight, Azure Analysis Services will soon offer the ability to build BI models on top of these big data platforms, enabling a similar hub-and-spoke model as with Azure SQL Data Warehouse.
In addition to the above, Azure Analysis Services can also consume data from on-premises data stores such as SQL Server, Oracle, and Teradata. We are working on adding support for several more data sources, both cloud and on-premises.
Azure Data Factory is a data integration service that orchestrates the movement and transformation of data, a core capability in any enterprise BI/analytics solution. Azure Analysis Services can be integrated into any Azure Data Factory pipeline by including an activity that loads data into the model. Azure Automation and Azure Functions can also be used for doing lightweight orchestration of models using custom code.
Power BI and Excel are industry leading data exploration and visualization tools for business users. Both can connect to Azure Analysis Services models and offer a rich interactive experience. In addition, third party BI tools such as Tableau are also supported.

How are customers using Azure Analysis Services?

Since we launched the public preview of Azure Analysis Services last October, thousands of developers have been using it to build BI solutions. We want to thank all our preview customers for trying out the product and giving us valuable feedback. Based on this feedback, we have made several quality, reliability, and performance improvements to the service. In addition, we introduced Scale Up & Down and Backup & Restore to allow customers to better manage their BI solutions. We also introduced the B1, B2, and S0 tiers to offer customers more pricing flexibility.

Following are some customers and partners that have built compelling BI solutions using Azure Analysis Services.

Milliman is one of the world's largest providers of actuarial and related products and services. They built a revolutionary, industry-first financial modeling product called Integrate, using Azure to run highly complex and mission-critical computing tasks in the cloud.

“Once the complex data movement and transformation processing is complete, the resulting data is used to populate a BI semantic model within Azure Analysis Services that is easy to use and understand. Power BI allows users to quickly create and share data through interactive dashboards and reports, providing a rich immersive experience for users to visualize and analyze data in one place, simply and intuitively. The combination of Power BI and Azure Analysis Services enables users of varying skills and backgrounds to be able to deliver to the ever-growing BI demands needed to run their business and collaborate on mission critical information on any device.”

Paul Maher, Principal and CTO, Milliman Life Technology Solutions

“Another great use case for Azure Analysis Services is leveraging its powerful modeling capabilities to bring together numerous disparate corporate data sources. An initiative at Milliman is currently in design leveraging various Finance data sets in order to create a broader scope and more granular access to critical business information. Providing a cohesive and simple-to-access data source for all levels of users gives the business leaders a new tool – whether they use Excel or Power BI for their business analytics.”

Andreas Braendle, CIO, Milliman

Contidis is a company in Angola that is building the new Candando supermarket chain. They created a comprehensive BI solution using Power BI and Azure Analysis Services to help their employees deliver better customer service, uncover fraud, spot inventory errors, and analyze the effectiveness of store promotions.

“Since we implemented our Power BI solution with Azure Analysis Services and Azure SQL Data Warehouse, we’ve realized a big improvement in business insight and efficiency. Our continued growth is due to many factors, and Power BI with Azure Analysis Services is one of them.”

Renato Correia, Head of IT and Innovation, Contidis

DevScope is a Microsoft worldwide partner who is helping customers build solutions using Azure Analysis Services.

“One of the great advantages of using Azure Analysis Services and Power BI is that it gives us the flexibility to start small and scale up only as fast as we need to, paying only for the services we use. We also have a very dynamic security model with Azure Analysis Services and Azure Active Directory, and in addition to providing row-level security, we use Analysis Services to monitor report usage and send automated alerts if someone accesses a report or data record that they shouldn’t.”

Rui Romano, BI Team Manager, DevScope

Azure Analysis Services is now generally available in 14 regions across the globe: Southeast Asia, Australia Southeast, Brazil South, Canada Central, North Europe, West Europe, West India, Japan East, UK South, East US 2, North Central US, South Central US, West US, and West Central US. We will continue to add regions based on customer demand, including government and national clouds.

Please use the following resources to learn more about Azure Analysis Services, get your questions answered, and give us feedback and suggestions about the product.

Overview
Documentation
Pricing
MSDN forum
Ideas & suggestions

Join us at the Data Insights Summit (June 12-13, 2017) or at one of the user group meetings where you can hear directly from our engineers and product managers.
Quelle: Azure

Facebook’s New Camera Could Make It Even Harder To Tell What’s Real

Stephen Lam / Reuters

Onstage this morning at Facebook’s annual developer conference, Mark Zuckerberg used the image of an ordinary coffee cup — displayed on the gigantic screen above him — to demonstrate Facebook’s new in-app camera, which uses superior artificial intelligence to recognize objects and then seamlessly manipulate them. Facebook’s software will know it’s a mug — just tap on the coffee and a toolbar will pop up with relevant effects like a cloud of steam. Or, said Zuckerberg, “you can add a second coffee mug, so it looks like you’re not having breakfast alone.”

Without naming his muse — Snapchat — Zuckerberg told the crowd of thousands that Facebook is ready to take augmented reality mainstream, to make it accessible to anyone with a smartphone. On stage, Zuckerberg ran through the primary use cases for Facebook’s camera, including adding digital objects, a la Pokemon Go, or the ability to “enhance digital objects like your home or your face.” Facebook CTO Mike Schroepfer offered a more seasonal example: “Let’s say I took a wonderful vacation photo and a windsurfer rudely interrupted my view.” With Facebook’s camera, the offending surfer could be easily edited out, Schroepfer explained, using a slide screen to show just how easy it was to rewrite vacation history.

The examples sounded as innocuous as could be, until you considered how they might play out in the real world. In the keynote, Facebook floated right past questions like: Can Facebook’s camera erase a man on dry land from a photograph as easily as it can a windsurfer? What realistic-looking items can Facebook instantly insert into a photo? In other words, just how much will people be able to doctor the photos that appear in their feeds? And will the people who see them know they’ve been manipulated?

Facebook didn’t demonstrate this trick on stage, but during an earlier interview, the company showed BuzzFeed News how its radical camera could take an ordinary photo of a person and manipulate their facial expression to make the person smile, or frown, or display whatever other emotion the smartphone-holder desired. Back in December, The Verge warned that artificial intelligence was already making it easy to make fake images and fake videos, pointing to a startup called SmileVector that can make any celebrity smile.

To be clear, many of the effects available now — like breakfast sharks flying around Zuckerberg’s cereal bowl — are clearly cartoons. Facebook declined to comment on the record, but the company’s Camera Effects Platform is still in closed beta: effects have to be submitted and reviewed by Facebook before being shared. Each effect also has to adhere to Facebook’s policies and terms governing what’s offensive or illegal. The company monitors how effects are being used and will update accordingly.

But soon enough, these tools will be distributed to nearly two billion users, with frictionless ease. And, as Zuckerberg said many times on stage, they’re still primitive. That’s an interesting posture for a company with a major fact-checking problem that has seen time and again the way its products can be used to foster hate speech, violence, and division. It’s worth noting that a recent report about Silicon Valley reengineering journalism traced fake-news opportunists back to Zuckerberg’s (benign-sounding) goal from 2014 to build a personalized paper.

We don’t know how Facebook Camera and the products built on it will be used in the real world until they’re, well, out in the real world, in the pockets of a billion-plus people — some of whom are assholes (or Macedonian teens). The people who use technology are, all too often, more creative than those who make it: They find new and ingenious ways to hack the algorithm, evade the censors, further their agendas, and make certain topics go viral. Facebook’s new camera effects could very well end up being an innocuous way to make breakfast fun and fix your vacation pictures — or it could mean we’ll soon be a nation divided over fake photos instead of fake news. In the meantime, that steam sure is cool.

Quelle: BuzzFeed

We Tried Out Facebook's New Social VR App At F8

SAN JOSE — Facebook announced a new social virtual reality app for its Oculus Rift headset today at F8, the company’s annual conference for software developers. It’s called Facebook Spaces, and you can download the beta version from the Oculus Store now.

In her keynote address, Rachel Rubin Franklin, Facebook’s head of social VR, said that Spaces signaled “the very beginnings of social VR.” People on Twitter said it looked a lot like Second Life and The Sims. Perhaps not coincidentally, Franklin previously worked as a vice president at Electronic Arts managing The Sims franchise.

Oculus Rift already supports the social game Altspace VR, made by an independent game publisher of the same name, where people can gather in virtual rooms via human or robot avatars and host events, play games, make art, watch 2D videos, or socialize.

Here’s how Spaces works:

First, you connect your Facebook account to Spaces in the “Devices Requests” tab of the Facebook mobile app. Then you’ll strap on your Rift headset and navigate to the Spaces app, which will appear in your library. You’ll need the Rift Touch controllers, which retail at $100 a pair, to use Facebook Spaces.

To create your virtual self, you choose from several versions of an avatar whose features are drawn from scans of your Facebook photos. You can customize some features, like hair, eye color, and glasses. The majority of the avatars seem to have large foreheads.

You can invite up to three friends to your space.

If they accept, you’ll find yourselves sitting around a virtual table. They’ll see your virtual avatar and the backdrop behind it. (You can choose from default backgrounds or use your own 360 pictures.) Your avatar can also video call friends via Messenger, which you can pull up as a flat menu in the virtual space. The video will appear as a 2D screen that you’ll be able to grab and move around. If they’re not in Facebook Spaces when you call, you’ll see their IRL face and surroundings, and they’ll see your avatar and your virtual setting.

If you get tired of your friends, Facebook included the ability to mute them or wholesale remove them from your virtual space. Gurl, bye.

Here’s a friend appearing…

To entertain yourselves, you can draw in 3D and take selfies.

Here’s my virtual selfie with Christian, who works at Facebook.

I was really into the 3D marker. He was unamused.

And here’s BuzzFeed video producer Allyson Laquian’s selfie, where she has a piece of pizza in her head.

The background is a 360 video of the Pyramids of Giza in Egypt.

360 videos can totally change your environment

To play a 360 video, you’ll open a menu of options using the virtual control interface and select from content that Facebook publishers have made or that you’ve recorded. When you grab a video from that menu, it’ll appear as a small orb in your virtual hand. To play it, you can either put it in your avatar’s mouth or in the center of the communal table.

The videos will play all around you, turning your virtual environment into the video. You’ll also be able to view two-dimensional videos and pictures from Facebook publishers and your own timeline within Spaces as movable flat screens.

Facebook said in a statement that it’s hoping to bring the app to more platforms in the future but didn’t specify which ones.

Quelle: BuzzFeed