The Alt-Right Has Its Own Comedy TV Show On A Time Warner Network

@Night_0f_Fire is in almost every way a quintessential alt-right Twitter user. He supports Donald Trump. He hates Hillary Clinton and questions her health. He retweets the full spectrum of the movement's icons, from macho culture warriors like Mike Cernovich and Pax Dickinson to conspiracy mongers like Alex Jones to overt racists. He calls Lena Dunham a “fat pig,” cheers the demise of Larry Wilmore's Nightly Show, and bemoans the presence of “burkhas in video games.” He mocks Black Lives Matter. His tweets are fully in line with the wildly prolific online movement that has spawned Milo Yiannopoulos, triple parentheses to demarcate Jews, and the term “cuckservative.”

In fact, there's really only one thing that separates @Night_0f_Fire — real name Sam Hyde — from the many other members of the angry, pro-Trump internet movement that grew out of Gamergate into a force capable of roiling American popular and political culture: Hyde has his own television show on Cartoon Network.

Million Dollar Extreme Presents: World Peace airs every Saturday at 12:15 a.m. on Adult Swim, the 8 p.m. to 6 a.m. incarnation of Cartoon Network famous for its stoner-y animation and sketch comedy. World Peace, which will air its fourth episode this weekend, is the latter. It's the first wide exposure for MDE, a comedy group comprising Hyde and two collaborators that has gained a cult following on the internet and a reputation for being hugely offensive.

Promotional material for World Peace winks at Hyde's alt-right fans. “Celebrate Diversity Every Friday at 12:15A ET,” reads the tagline on the Adult Swim website. Press copy announcing the show promised that “World Peace will unlock your closeted bigoted imagination, toss your inherent racism into the burning trash, and cleanse your intolerant spirit with pure unapologetic American funny_com.” Though none of the three episodes that have aired so far have touched on politics or the alt-right, they have hardly been in good taste. The most recent episode of the show opened with Hyde, in blackface, speaking in exaggerated black vernacular for three minutes.

According to Showbuzz Daily, World Peace ranked number two among original cable shows on the night of its premiere, with more than a million viewers.

The alt-right — which will attain its greatest notoriety yet when Hillary Clinton gives a speech today denouncing it — has noticed the show. On Twitter, a steady stream of pro-Trump troll accounts has anointed World Peace “the only non cucked TV show” and “redpilled TV” that “will save the west.” A subreddit devoted to the show, moderated by someone claiming to be Hyde, describes itself with the ubiquitous Trump hashtag as “the Best Damn Internet Community™ on God's green earth.” My Posting Career, the 4chan-meets-far-right-politics forum that helped coin “cuckservative,” is running a special “Faggot alert” at the top of the page alerting readers that World Peace airs every Friday.

Reached via phone, Hyde attributed all of the tweets and Reddit posts to “his assistant.” Asked if he was a member of the alt-right, Hyde responded with a question: “Is that some sort of indie book store?”

Turner, which owns Cartoon Network and Adult Swim, responded to a request for comment by forwarding a written statement from an Adult Swim spokesperson:

“Adult Swim’s reputation and success with its audience has always been based on strong and unique comedic voices. Million Dollar Extreme’s comedy is known for being provocative with commentary on societal tropes, and though not a show for everyone, the company serves a multitude of audiences and supports the mission that is specific to Adult Swim and its fans.”

For the Carnegie Mellon and RISD–educated Hyde, World Peace is the latest act in a years-long career of making people uncomfortable. Though MDE has been publishing videos since at least 2009 (an early one is titled “old faggot”), Hyde is probably most famous for a 2013 stunt in which he hijacked a TEDx symposium in Philadelphia and gave a nonsensical presentation called “2070 Paradigm Shift,” to polite applause. More often — and surely the major reason for his popularity among the alt-right — he exploits, sometimes cruelly, cultural sensitivities around race, gender, and sexual orientation.

At a 2013 comedy event in Brooklyn, he performed a shocking set, a recording of which became a minor viral hit titled “Privileged White Male Triggers Oppressed Victims, Ban This Video Now and Block Him.” Hyde began by mocking the “hipster faggot” audience — at which point a few onlookers immediately left — then removed a piece of paper from his back pocket and proceeded to read 15 minutes of anti-gay pseudoscience (“homosexuality is the manifestation of intense perversion and antisocial attitudes”) and outright hate speech (“next time you see a crazy gay person maybe it's not because they were bullied, maybe it's not because of homophobia … maybe it's just because of their faggot brain that's all fucked up”). He concluded by blaming positive portrayals of gay people on television on the “ZOG (Zionist Occupation Government) media machine destroying the family.” At the end of the set, he went outside to argue with some of the people who had left.

Last year, BuzzFeed News reported that a gun- and knife-brandishing internet personality named Jace Connors — who became notorious for claiming to crash his car while en route to the home of Brianna Wu, one of the most public victims of Gamergate — was actually the work of a member of MDE named Jan Rankowski, who created the Connors “character” with input from Hyde.

And last fall, Hyde and fellow MDE member Charls Carroll showed up near the Yale campus in New Haven bearing signs reading “All Lives Matter” and “No More Dead Black Children,” then proceeded to film a highly uncomfortable 15-minute video called “Yale Lives Matter” in which Hyde, among other things, lectures a black Apple store security guard that he is “playing a part in an oppressive system,” harangues the black employee of a preppy clothes store for selling “slave owner clothes,” and asks two young white men if they “killed any minorities today.”

This year, Hyde — or his assistant — seems to have decided to cast his lot with the alt-right. Though Hyde has deleted all his tweets from before the new year, since then he's been remarkably consistent in engaging with the major concerns of and personalities in the movement.

The alt-right, which idealizes offensive speech as a principled transgression against a censorious liberal culture, is a natural fit for MDE's comedy, which combines nerdy references to anime and video games with the sinister goofiness of Tim and Eric, the anti-PC mean streak of pre-corporate Vice, and the terminal irony of meme culture. Indeed, MDE and Hyde specifically have been beloved on 4chan, one of the alt-right's incubators, for years.

If Hyde isn't quite of shitlord culture, he most certainly plays along. Earlier this year, Hyde became the possibly witting subject of a series of 4chan-perpetrated hoaxes that named him as the suspect in a series of mass shootings. A first cut of World Peace, aired online as part of an Adult Swim series called Development Meeting, featured a logo that fans quickly figured out was a copy of a symbol that Aurora shooter James Holmes scribbled in his notebooks. (It was cut from the actual broadcast.)

All of which gives the impression that World Peace is one massive in-joke, designed to signal to a group of people online for whom the limits of irony have been misplaced and forgotten: identity content for the worst trolls in the world. After being revealed as the Jace Connors character, Jan Rankowski told BuzzFeed News that the videos had been a satire of the “over-the-top, super-hyper-macho armed Gamergater.”

It's a trap just to read Sam Hyde literally — he's built a career out of making fun of people who take his speech too seriously. But that has not stopped Hyde's alt-right admirers from trying to divine his true politics, in the same way they scan his show for secret messages. The closest they've come is a post by Hyde — or his assistant — on the MDE subreddit from late last year in which he — or his assistant — describes himself as basically a libertarian who believes that “we're putting Western Civ on the alter [sic] as a sacrifice to white guilt because we're worried some frizzy-haired Afro transsexual will wag his finger at us” and that “whites need to regain some sort of cohesive tribal self-interest and identity right now just like everybody else has.”


Whether or not this is genuine is basically unknowable, as Hyde never publicly breaks character. (Though he did add over the phone that “my assistant does a good job.”) It's also beside the point for everyone except the converted — including executives at Adult Swim. Because it's also a trap not to see the seriousness of what Hyde and co. are doing, even if they're LOLing along the way and being disingenuous. Indeed, when the consequences of a culture involve the serial harassment and illegal publication of explicit photographs of a black actress because she had the temerity to stick up for herself, does it really matter whether the people having a laugh over it are in character?

Quelle: BuzzFeed

First look at VMware vCloud provider in ManageIQ / CloudForms

With the VMworld 2016 US event just around the corner, we thought it would be a good time to look at some of the new features introduced in the ManageIQ community related to our support for VMware.
ManageIQ is the open source project behind Red Hat CloudForms. The latest product features are implemented in the upstream community first, before eventually making it downstream into Red Hat CloudForms. This process is similar for all Red Hat products. For example, Fedora is the upstream project for Red Hat Enterprise Linux and follows the same upstream-first development model.
In this article, we look at the recent development of a vCloud provider in ManageIQ. VMware vCloud becomes the latest addition to the list of supported public cloud providers, joining Amazon Web Services, Microsoft Azure, Google Cloud Platform and OpenStack.

Those of you following ManageIQ might have noticed some recent pull requests on the topic under the providers/vmware/cloud label. This is the first iteration of the vCloud provider, which supports authentication, inventory (including vApps), provisioning, power operations, and eventing. The provider makes use of the vCloud API provided by vCloud Director (vCD). vCD is part of the VMware solutions for vCloud Air and vCloud Service Providers.
 
A demonstration of this new provider can be seen in ManageIQ Sprint 45. Here are some screenshots from the provider in action in ManageIQ.
 

New Cloud Provider Type: VMware vCloud
 

vCloud Cloud Provider Summary (listing Instances, Images and Orchestration Stacks)
 

Detailed list of Orchestration Stacks (vApps)
 

Detailed list of Images
 

Detailed list of Instances
 

Detailed view of an Instance, including Power Operations
 
This is a great start towards adding support for VMware vCloud in ManageIQ. We look forward to future contributions and hope to see this provider make it downstream into Red Hat CloudForms in a future release.
Come and meet Red Hat at the VMworld 2016 US event to learn about ManageIQ and Red Hat CloudForms, which provide unified management for container, virtual, private, and public cloud infrastructures.
Quelle: CloudForms

Getting started with Google Cloud Client Libraries for .NET

Posted by Mete Atamel, Developer Advocate

Last week, we introduced new tools and client libraries for .NET developers to integrate with Google Cloud Platform, including Google Cloud Client Libraries for .NET, a set of new client libraries that provide an idiomatic way for .NET developers to interact with GCP services. In this post, we’ll explain what it takes to install the new client libraries for .NET in your project.

Currently, the new client libraries support a subset of GCP services, including Google BigQuery, Google Cloud Pub/Sub and Google Cloud Storage (for other services, you still need to rely on the older Google API Client Libraries for .NET). Both sets of libraries can coexist in your project and as more services are supported by the new libraries, dependencies on the older libraries will diminish.

Authentication
As you would expect, the new client libraries are published on NuGet, the popular package manager for .NET, so it’s very easy to include them in your project. But before you can use them, you’ll need to set up authentication.

The GitHub page for the libraries (google-cloud-dotnet) describes the process for each different scenario in the authentication section. Briefly, to authenticate for local development and testing, install Cloud SDK for Windows, which comes with Google Cloud SDK shell, and use the gcloud command line tool to authenticate.

If you haven’t initialized gcloud yet, run the following command in the Google Cloud SDK shell to initialize your project and zones, and to set up authentication along the way:

$ gcloud init

If you’ve already set up gcloud and simply want to authenticate, run this command instead:

$ gcloud auth login

Installation

Now, let’s import and use the new libraries. Create a project in Visual Studio (but make sure it’s not a .NET Core project, as those are not supported by the libraries yet), right click on the project references and select “Manage NuGet packages”:

In the NuGet window, select “Browse” and check “Include prerelease.” The full list of supported services and their NuGet package names can be found on the google-cloud-dotnet page. Let’s install the library for Cloud Storage, which means searching for Google.Storage:

The resulting list shows the new client library for Cloud Storage (Google.Storage) along with the low-level library (Google.Apis.Storage) that it depends on. Select Google.Storage and install it. When installation is complete, you’ll see Google.Storage as a reference, along with its Google.Apis dependencies:

That’s it! Now, you can use the new client library for Cloud Storage from your .NET application. If you’re looking for a sample, check out the Cloud Storage section of the GitHub page for the libraries.
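To give a flavor of the API, here is a minimal sketch of listing buckets and uploading a file with the new client. The project ID, bucket name, and file name are placeholders, and the namespace may differ while the package is in prerelease, so treat the GitHub samples as the authoritative reference.

using System;
using System.IO;
using Google.Storage.V1; // namespace assumed for the prerelease Google.Storage package; confirm against the README

class StorageQuickstart
{
    static void Main()
    {
        // Relies on the credentials configured earlier with the gcloud tool.
        var client = StorageClient.Create();

        // List the buckets in your project (replace the project ID with your own).
        foreach (var bucket in client.ListBuckets("your-project-id"))
        {
            Console.WriteLine(bucket.Name);
        }

        // Upload a local file to an existing bucket (names here are placeholders).
        using (var stream = File.OpenRead("hello.txt"))
        {
            client.UploadObject("your-bucket-name", "hello.txt", "text/plain", stream);
        }
    }
}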

Give it a try and let us know what you think. Any issues? Report them here. Better yet, help us improve our support for .NET applications by contributing.

Quelle: Google Cloud Platform

JSON support is generally available in Azure SQL Database

We are happy to announce that you can now query and store both relational and textual data formatted in JavaScript Object Notation (JSON) using Azure SQL Database. Azure SQL Database provides simple built-in functions that read data from JSON text, transform JSON text into a table, and format data from SQL tables as JSON.

You can use JSON functions that enable you to extract a value from JSON text (JSON_VALUE), extract an object from JSON (JSON_QUERY), update a value in JSON text (JSON_MODIFY), and verify that JSON text is properly formatted (ISJSON). The OPENJSON function enables you to convert JSON text into a table structure. Finally, JSON functionality enables you to easily format the results of any SQL query as JSON text using the FOR JSON clause.
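To make that list concrete, here is a minimal sketch of calling JSON_VALUE, ISJSON, and FOR JSON from a .NET application with System.Data.SqlClient; the connection string, table, and column names (dbo.People, Info) are invented for the example.

using System;
using System.Data.SqlClient;

class JsonQueryExample
{
    static void Main()
    {
        // Placeholder connection string; use your own Azure SQL Database credentials.
        var connectionString = "Server=tcp:yourserver.database.windows.net;Database=yourdb;User ID=you;Password=...;Encrypt=True;";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // JSON_VALUE pulls a scalar out of a JSON column, ISJSON guards against malformed text,
            // and FOR JSON PATH formats the whole result set as JSON text.
            var sql = @"SELECT Id, Name, JSON_VALUE(Info, '$.address.city') AS City
                        FROM dbo.People
                        WHERE ISJSON(Info) > 0
                        FOR JSON PATH";

            using (var cmd = new SqlCommand(sql, conn))
            using (var reader = cmd.ExecuteReader())
            {
                // FOR JSON returns the JSON text in a single column, possibly split across several rows.
                while (reader.Read())
                {
                    Console.Write(reader.GetString(0));
                }
                Console.WriteLine();
            }
        }
    }
}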

What can you do with JSON?

JSON in Azure SQL Database enables you to build and exchange data with modern web, mobile, and HTML5/JavaScript single-page applications, and with NoSQL stores such as Azure DocumentDB that contain data formatted as JSON, and to analyze logs and messages collected from different systems and services. Now you can easily integrate your Azure SQL Database with any service that uses JSON.

Easily expose your data to modern frameworks and services

Do you use services that exchange data in JSON format, such as REST services or Azure App Services? Do you have components or frameworks that use JSON, such as AngularJS, ReactJS, D3, or jQuery? With the new JSON functionality, you can easily format data stored in Azure SQL Database as JSON and expose it to any modern service or application.

Easy ingestion of JSON data

Are you working with mobile devices or sensors, services that produce JSON such as Azure Stream Analytics or Application Insights, or systems that store data in JSON format such as Azure DocumentDB or MongoDB? Do you need to query and analyze JSON data using the well-known SQL language or tools that work with Azure SQL Database? Now you can easily ingest JSON data, store it in Azure SQL Database, and use any language or tool that works with Azure SQL Database to query and analyze the loaded information.
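As a sketch of that ingestion path, a batch of JSON readings can be shredded into rows with OPENJSON in a single INSERT. The table, column names, and JSON shape below are invented for illustration, and OPENJSON requires database compatibility level 130.

using System.Data.SqlClient;

class JsonIngestExample
{
    // Inserts every element of a JSON array into a relational table using OPENJSON ... WITH.
    static void LoadReadings(string connectionString, string jsonPayload)
    {
        const string sql = @"
            INSERT INTO dbo.DeviceReadings (DeviceId, ReadingTime, Temperature)
            SELECT DeviceId, ReadingTime, Temperature
            FROM OPENJSON(@json)
            WITH (
                DeviceId    int           '$.deviceId',
                ReadingTime datetime2     '$.time',
                Temperature decimal(5, 2) '$.temp'
            );";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@json", jsonPayload);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

Passing a payload such as [{"deviceId": 1, "time": "2016-08-25T12:00:00", "temp": 21.5}] would produce one row per array element.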

Simplify your data models

Do you need to store and query both relational and semi-structured data in your database? Do you need to simplify your data models like in NoSQL data platforms? Now you can combine structured relational data with schema-less data stored as JSON text in the same table. In Azure SQL Database you can use the best approaches both from relational and NoSQL worlds to tune your data model. Azure SQL Database enables you to query both relational and JSON data with the standard Transact-SQL language. Applications and tools would not see any difference between values taken from table columns and the values extracted from JSON text.

Next steps

To learn how to integrate JSON in your application, check out our Getting Started page or Channel 9 video. To learn about various scenarios that show how to integrate JSON in your application, see demos in this Channel 9 video or find some scenario that might be interesting for your use case in these JSON Blog posts.

Stay tuned because we will constantly add new JSON features and make JSON support even better.
Quelle: Azure

Networking for a hybrid application environment

Supporting a network in transition: Q&A blog post series with David Lef

In a series of blog posts, this being the fourth and last, David Lef, Principal Network Architect at Microsoft IT, chats with us about supporting a network as it transitions from a traditional infrastructure to a fully wireless platform. Microsoft IT is responsible for supporting 900 locations and 220,000 users around the world. David is helping to define the evolution of the network topology to a cloud-based model in Azure that supports changing customer demands and modern application designs.

David Lef identifies the considerations and challenges of supporting a network environment as line of business applications move from an on-premises environment to the cloud using Azure.

Q: Can you explain your role and the environment you support?

A: My role at Microsoft is principal network architect with Microsoft IT. My team supports almost 900 sites around the world and the networking components that connect those sites, which are used by a combination of over 220,000 Microsoft employees and vendors that work on our behalf. Our network supports over 2,500 individual applications and business processes. We’re responsible for providing wired, wireless, and remote network access for the organization, implementing network security across our network (including our network edges), and connectivity to Microsoft Azure in the cloud. We support a large Azure tenancy using a single Azure Active Directory tenancy that syncs with our internal Windows Server Active Directory forests. We have several connections from our on-premises datacenters to Azure using ExpressRoute. Our Azure tenancy supports a huge breadth of Azure resources, some of which are public-facing and some that are hosted as apps and services internal to Microsoft, but hosted on the Azure platform. We currently host forty percent of our LOB apps using Azure, and that number is continually increasing.

Q: Can you give me a quick history of how this migration has happened and how the network teams have supported it?

A: It started with Azure platform as a service (PaaS), which we were using for internal and external app and service solutions. PaaS was the first primary component that was offered on Azure, so we naturally started there when developing solutions. Most of our early applications were hosted completely in Azure. The hybrid scenario wasn’t fully developed or supported, so we didn’t implement that in our solutions.

However, as Azure networking and infrastructure as a service (IaaS) components have been introduced and matured, we’ve had much more flexibility in how we implement Azure-based solutions and how those solutions interact with each other and our on-premises infrastructure.

We adopted a strategy for Azure migration that addressed the most logical migration scenarios first: basic web apps, any new solutions, and any solutions that were targeted for redesign. Next, we looked at some more challenging apps, such as those with large bandwidth or resource usage requirements, or those that had regulatory implications or significantly affected business-critical operations. Finally, the most difficult and costly apps are left, such as customized legacy solutions with code that is difficult to update.

The main enablers for hybrid line of business (LOB) apps were the addition of ExpressRoute, which gives any Azure tenant the ability to obtain a private, dedicated connection to Azure from their datacenter, and the maturation of Azure IaaS, which has enabled us to take on-premises infrastructure and migrate it directly to Azure-hosted virtual machines without having to modify the components of the infrastructure. We have also widely adopted software as a service (SaaS) solutions, such as Office 365.

Our primary support channel has been to facilitate the connectivity required by hybrid solutions. A lot of the connectivity is on the back end through ExpressRoute and the configuration and management required to connect our datacenters to Azure, but we also manage networking within Azure. There are some important security and compliance considerations when cloud and on-premises mix, and we’ve been diligent in ensuring that our data and infrastructure in the cloud is as secure as it is in our datacenters. Most of our Azure apps and services are available from some Internet touch-point, so it’s important for us to delineate between front end and back end within our solutions. We don’t want our infrastructure living in IaaS exposed to the public Internet.

Q: Have the challenges changed through this journey?

A: They’re always changing! Change is the nature of the cloud, and that’s really the first challenge we faced: understanding that solutions, processes, and methods are fluid and changing in Azure. There is a continuous stream of features being offered and changed; some of them can be inconsequential to a solution, while others can be game changers.

We understand that there are many ways to connect to Azure, from the datacenter perspective and from the user perspective. As much as possible, we try to provide the freedom to allow our application owners in Azure the choice in how they connect their apps to the datacenter and to their customer.

As an organization, we’ve had to learn how to modify our strategies to focus on the cloud first. Hosting LOB apps in Azure is a fundamental change in how the apps are used and supported. We’ve been diligent in keeping support and communication channels open with our application owners and our customers, and it’s critical that they understand the nature of the cloud and what that means to them and their business. Our cloud-first, mobile-first strategy means that everything is directed to Azure first, and it’s a cultural change that has been spearheaded by our CEO, Satya Nadella, and cascaded down through the rest of the organization. There is an excellent article that highlights the continuing evolution of our cloud strategy.

From a technical perspective, we have a lot of tools and processes in place to help us manage our Azure resources in a way that matches the fluidity of the environment. The integration of Azure Resource Manager (ARM) has been critical to centralizing and standardizing solution management and configuration within Azure. Resource groups and ARM templates provide the functionality we need to configure things properly and ensure they stay configured properly through changes to app requirements or Azure functionality. It’s a constant challenge to maintain both the datacenter and Azure concurrently, both technically and logistically. Migrations to Azure are a constant stream, so the configuration of our datacenters and Azure environments are in constant flux. We have and will have a long legacy tail that will be carried around for a while, so we have to ensure that we’re providing and managing communications between Azure and our on-premises environments as proactively as possible, and educating our tenants and users on how to use both effectively.

Learn more

Other blog posts in this series:

Supporting network architecture that enables modern work styles
Engineering the move to cloud-based services
Cloud-first, mobile-first: Microsoft moves to a fully wireless network

Learn more about how Microsoft IT is evolving its network architecture here.
Quelle: Azure

Why enterprises trust Azure with their apps and data

It takes a lot to earn the trust of enterprise IT, and rightly so: software runs the operations of almost every business around the world. For Microsoft, earning your trust has been a multi-decade investment, not something we started after we got into the cloud business. Everything we’ve done to earn your trust over the years we have applied to Azure.

At Microsoft, security, privacy, and compliance considerations have been baked into the development process for a very long time – it’s core to our culture. The Secure Development Lifecycle, an open methodology which developers can use to help them build more secure software, was invented at Microsoft over a decade ago and has been adopted broadly, across industries: safer software helps everybody.

Of the nearly $12 billion Microsoft spent on research and development last year, $1 billion was focused on our cybersecurity efforts. Because so many individuals and businesses rely on Microsoft, we feel a great responsibility here; and we have a distinct vantage point. As Ann Johnson, VP of our enterprise cybersecurity team, writes, “Microsoft has a unique position in cybersecurity. Because of the massive scale of information that Microsoft processes, for example, billions of device updates and hundreds of billions of emails and authentications, we’re able to synthesize threat data far faster than your organization could ever do it alone.”

Microsoft’s Digital Crimes Unit (pictured) works with attorneys and law enforcement around the globe to catch digital criminals. They use sophisticated analytics and visualization tools, running in Azure, of course.

When it comes to regulatory compliance with standards like HIPAA, PCI, FedRAMP, and hundreds of others around the world, Microsoft has more certifications than any other cloud provider, and is continually adding more. Check out my colleague Alice Rison’s frequent updates on the Azure blog.

We’re continually researching new technologies to further advance the state of the art in digital security and privacy. For example, with homomorphic encryption, it’s possible to perform operations on data while never decrypting it, and you can download open-source code from Microsoft Research to try it today. And Microsoft’s work in post-quantum cryptography helps ensure that security can be maintained even when, in the future, quantum computers are able to break the RSA cryptosystem, the standard today.

The point? Microsoft has your back.

Is it any wonder, then, that enterprises are increasingly turning to Azure? Two important studies, one from infrastructure security firm HyTrust and another from Cowen & Company, both show that the majority of you are thinking about Azure. In particular, Cowen’s study showed that seventy-three percent of you expect to adopt Azure within the next 12 to 18 months.

We appreciate your confidence! We’ll have a lot more to say about security, and a wide array of other topics, at Ignite on September 24-26 in Atlanta. If you can’t make it, be sure to save the date and watch online.
Quelle: Azure

Retrieve platform notification system error details with Azure Notification Hubs

We enabled Platform Notification System Feedback a while back to improve monitoring and debugging, where all channel error feedback from Platform Notification Systems associated with your hub is put in a storage blob for you to peruse. If you haven’t yet, I recommend checking out the feature and the simple sample we prepared. Many customers found this very useful, but wished for a way to see this feedback per message request to Notification Hubs.

We thought about it and it was a wonderful idea!

With our latest updates, we’ve added a new field, PnsErrorDetailsUri, to Per Message Telemetry – if you haven’t worked with Per Message Telemetry, you can read about it here. This means that, as part of Per Message Telemetry, we process per-message feedback from Platform Notification Systems as we push notifications out, extract the errors, and put them in a blob whose URI is then presented. This makes the feedback much more targeted and useful, helping you detect any errors in your pushes.

Here is an overview of the differences between Platform Notification System Feedback and the PNS Error Details we added to Per Message Telemetry:

Scope: Platform Notification System Feedback covers the entire notification hub, while Per Message Telemetry’s PNS Error Details are scoped to a single notification ID.
Content: Platform Notification System Feedback contains expired channel and bad channel errors from the PNS, while PNS Error Details contain any errors returned by the PNS.

Both PNS Feedback and PNS Error Details are available for Standard Tier namespaces.

If you are using REST, the Per Message Telemetry response will include an additional PnsErrorDetailsUri field when you work with Api-Version 2016-07 or above. The errors can be any of the following:

Invalid PNS credentials
PNS unreachable
Bad channel
Expired channel
Wrong channel
PNS throttled
Invalid token
Wrong token
Dropped

Note that the error details are only fully available after the associated notification send operation is complete, and that you will get NULL for PnsErrorDetailsUri if there is no error.

If you are using our NuGet, simply add a few lines of code to extract the PnsErrorDetailsUri from the notification outcome details and read the blob content it points to.

// Get Notification ID from any send request
var outcome = await client.SendWindowsNativeNotificationAsync(winPayload.ToString(), tag);

// Get pns error detail uri once notification processing is complete
var feedbackUri = string.Empty;
var retryCount = 0;
while (retryCount++ < 6)
{
    var result = client.GetNotificationOutcomeDetailsAsync(outcome.NotificationId).Result;
    if (result.State != NotificationOutcomeState.Completed)
    {
        await Task.Delay(TimeSpan.FromSeconds(10));
    }
    else
    {
        feedbackUri = result.PnsErrorDetailsUri;
        break;
    }
}

if (!string.IsNullOrEmpty(feedbackUri))
{
    Console.WriteLine("feedbackBlobUri: {0}", feedbackUri);
    var feedbackFromBlob = ReadFeedbackFromBlob(new Uri(feedbackUri));
    Console.WriteLine("Feedback from blob: {0}", feedbackFromBlob);
}

You can easily read the blob with the following helper method, using the Azure Storage NuGet package.

private static string ReadFeedbackFromBlob(Uri uri)
{
    var currentBlock = new CloudAppendBlob(uri);
    var stringbuilder = new StringBuilder();
    using (var streamReader = new StreamReader(currentBlock.OpenRead()))
    {
        while (!streamReader.EndOfStream)
        {
            string currentFeedbackString = streamReader.ReadLine();
            if (currentFeedbackString != null)
            {
                stringbuilder.AppendLine(currentFeedbackString);
            }
        }
    }
    return stringbuilder.ToString();
}

We will be updating the Node.js SDK soon to enable this feature as well. Meanwhile, give it a try with our NuGet or REST APIs and let us know what you think!
Quelle: Azure

New AWS Public Data Set – Spacenet on AWS

As of today, high-resolution satellite imagery from the SpaceNet corpus is available as an AWS Public Data Set. The SpaceNet corpus includes approximately 1,900 square kilometers of full-resolution 50 cm imagery collected by DigitalGlobe’s WorldView-2 commercial satellite, including 8-band multispectral data. The dataset also includes 220,594 building footprints derived from this imagery, which can be used as training data for machine learning. The first Area of Interest (AOI) to be released covers Rio de Janeiro, Brazil, and more AOIs will be made available in the future.
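If you want to browse the corpus programmatically, the sketch below lists objects with the AWS SDK for .NET. The bucket name, prefix, region, and requester-pays setting are assumptions about how the data set is hosted, so confirm the actual S3 location on the SpaceNet detail page.

using System;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

class SpaceNetListing
{
    static void Main()
    {
        // Region, bucket, and prefix are assumptions; check the data set's detail page for the real values.
        using (var s3 = new AmazonS3Client(RegionEndpoint.USEast1))
        {
            var request = new ListObjectsV2Request
            {
                BucketName = "spacenet-dataset",       // assumed bucket name
                Prefix = "AOI_1_Rio/",                 // hypothetical prefix for the Rio de Janeiro AOI
                RequestPayer = RequestPayer.Requester  // many public data sets are requester-pays
            };

            var response = s3.ListObjectsV2(request);
            foreach (var obj in response.S3Objects)
            {
                Console.WriteLine("{0} ({1} bytes)", obj.Key, obj.Size);
            }
        }
    }
}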
Quelle: aws.amazon.com