After a bit of investigation, Neumann realized that Bing Maps’ data set essentially covered the entire planet. The only problem? It was all in 2D. After using some of that data to build a flyable 3D version of Seattle, Neumann turned to the Azure team to craft a machine learning method for converting the entire planet into a giant 3D model.
“AI has just tremendously grown in the last few years,” said Eric Boyd, CVP of Azure AI, in an interview. “It’s really driven by the massive amounts of data that are now available, combined with the massive amounts of compute that exist in the cloud … The results you can see are really pretty spectacular where you can come up with algorithms that now look at literally every square kilometer of the planet to identify the individual trees, grass and water, and then use that to build 3D models.”
Azure’s integration goes beyond the shape of the world. It also powers the flight controller voices with AI speech generation technology, producing speech that sounds almost indistinguishable from a human’s. It’s so natural that many players may assume Microsoft is relying solely on voice actors.
Since the company began exploring ways to bring Azure AI into the game in 2016, the capabilities of machine learning have also evolved dramatically, according to Boyd. “The AI algorithm space has really grown in the last several years,” he said. “And so vision algorithms, which is what’s heavily used to identify all these different trees and buildings and classify them exactly, those have come a tremendous way.”
Since it leans so heavily on the cloud, Flight Simulator is a “living game” in the truest sense, Neumann said. All of the machine learning algorithms the game relies on will steadily improve over time, as the company irons out bugs and optimizes the engine. (And perhaps becomes more aware of potential issues, like the typo that created a 212-story tower in Melbourne.) But he points out the algorithms can only be as good as the source data, so Microsoft is working harder to refine that as well.
But if you’re among those who believe Facebook already knows too much about our lives, you’re probably more than slightly disturbed by the idea of Facebook having a semi-permanent presence on your actual face.
Facebook
Facebook, to its credit, is aware of this. The company published a lengthy blog post on all the ways it’s taking privacy into consideration. For example, it says workers who wear the glasses will be easily identifiable and will be trained in “appropriate use.” The company will also encrypt data and blur faces and license plates. It promises the data it collects “will not be used to inform the ads people see across Facebook’s apps,” and only approved researchers will be able to access it.
But none of that addresses how Facebook intends to use this data or what type of “research” it will be used for. Yes, it will further the social network’s understanding of augmented reality, but there’s a whole lot else that comes with that. As the digital rights organization Electronic Frontier Foundation (EFF) noted in a recent blog post, eye tracking alone has numerous implications beyond the core functions of an AR or VR headset. Our eyes can indicate how we’re thinking and feeling — not just what we’re looking at.
As the EFF’s Rory Mir and Katitza Rodriguez explained in the post:
How we move and interact with the world offers insight, by proxy, into how we think and feel at the moment. If aggregated, those in control of this biometric data may be able to identify patterns that let them more precisely predict (or cause) certain behavior and even emotions in the virtual world. It may allow companies to exploit users’ emotional vulnerabilities through strategies that are difficult for the user to perceive and resist. What makes the collection of this sort of biometric data particularly frightening, is that unlike a credit card or password, it is information about us we cannot change. Once collected, there is little users can do to mitigate the harm done by leaks or data being monetized with additional parties.
There’s also a more practical concern, according to Rodriguez and Mir. That’s “bystander privacy,” or the right to privacy in public. “I’m concerned that if the protections are not the right ones, with this technology, we can be building a surveillance society where users lose their privacy in public spaces,” Rodriguez, International Rights Director for EFF, told Engadget. “I think these companies are going to push for new changes in society of how we behave in public spaces. And they have to be much more transparent on that front.”
In a statement, a Facebook spokesperson said that “Project Aria is a research tool that will help us develop the safeguards, policies and even social norms necessary to govern the use of AR glasses and other future wearable devices.”
Facebook is far from the only company to grapple with these questions. Apple, which is reportedly working on an AR headset, also seems to be experimenting with eye tracking. Amazon, on the other hand, has taken a different approach to understanding our emotional state.
Consider its newest wearable: Halo. At first glance, the device, which is an actual product people will soon be able to use, seems much closer to the kinds of wrist-worn devices that are already widely available. It can check your heart rate and track your sleep. It also has one other feature you won’t find on your standard Fitbit or smartwatch: tone analysis.
Opt in and the wearable will passively listen to your voice throughout the day in order to “analyze the positivity and energy of your voice.” It’s supposed to aid in your overall well-being, according to Amazon. The company suggests that the feature will “help customers understand how they sound to others,” and “support emotional and social well-being and help strengthen communication and relationships.”
Amazon
If that sounds vaguely dystopian, you’re not alone; the feature has already sparked more than one Black Mirror comparison. Also concerning: history has repeatedly taught us that these kinds of systems often end up being extremely biased, regardless of their creators’ intent. As Protocol points out, AI systems tend to be pretty bad at treating women and people of color the same way they treat white men. Amazon itself has struggled with this. A study last year from MIT’s Media Lab found that Amazon’s facial recognition tech had a hard time accurately identifying the faces of dark-skinned women. And a 2019 Stanford study found racial disparities in Amazon’s speech recognition tech.
So while Amazon has said it uses diverse data to train its algorithms, it’s far from guaranteed that it will treat all its users the same in practice. But even if it did treat everyone fairly, giving Amazon a direct line into your emotional state could also have serious privacy implications.
And not just because it’s creepy for the world’s biggest retailer to know how you’re feeling at any given moment. There’s also the distinct possibility that Amazon could, one day, use these newfound insights to get you to buy more stuff. Just because there’s currently no link between Halo and Amazon’s retail service or Alexa doesn’t mean that will always be the case. In fact, we know from patent filings that Amazon has given the idea more than a passing thought.
The company was granted a patent two years ago that lays out in detail how Alexa may proactively recommend products based on how your voice sounds. The patent describes a system that would allow Amazon to detect “an abnormal physical or emotional condition” based on the sound of a voice. It could then suggest content, surface ads and recommend products based on the “abnormality.” Patent filings are not necessarily indicative of actual plans, but they do offer a window into how a company is thinking about a particular type of technology. And in Amazon’s case, its ideas for emotion detection are more than a little alarming.
An Amazon spokesperson told Engadget that “we do not use Amazon Halo health data for marketing, product recommendations, or advertising,” but declined to comment on future plans. The patent offers some potential clues, though.
Google Patents/Amazon
“A current physical and/or emotional condition of the user may facilitate the ability to provide highly targeted audio content, such as audio advertisements or promotions,” the patent states. “For example, certain content, such as content related to cough drops or flu medicine, may be targeted towards users who have sore throats.”
In another example — helpfully illustrated by Amazon — an Echo-like device recommends a chicken soup recipe when it hears a cough and a sniffle.
As unsettling as that sounds, Amazon makes clear that it’s not only taking the sound of your voice into account. The patent notes that it may also use your browsing and purchase history, “number of clicks,” and other metadata to target content. In other words: Amazon would use not just your perceived emotional state, but everything else it knows about you to target products and ads.
Which brings us back to Facebook. Whatever product Aria eventually becomes, it’s impossible now, in 2020, to fathom a version of this that won’t violate our privacy in new and inventive ways in order to feed Facebook’s already disturbingly precise ad machine.
Facebook’s mobile apps already vacuum up an astounding amount of data about where we go, what we buy and just about everything else we do on the internet. The company may have desensitized us enough at this point to take that for granted, but it’s worth considering how much more we’re willing to give away. What happens when Facebook knows not just where we go and who we see, but everything we look at?
A Facebook spokesperson said the company would “be up front about any plans related to ads.”
“Project Aria is a research effort and its purpose is to help us understand the hardware and software needed to build AR glasses – not to personalize ads. In the event any of this technology is integrated into a commercially available device in the future, we will be up front about any plans related to ads.”
A promise of transparency, however, is much different than an assurance of what will happen to our data. And it highlights why privacy legislation is so important — because without it, we have little alternative but to take a company’s word for it.
“Facebook is positioning itself to be the Android of AR VR,” Mir said. “I think because they’re in their infancy, it makes sense that they’re taking precautions to keep data separate from advertising and all these things. But the concern is, once they do control the medium or have an Android-level control of the market, at that point, how are we making sure that they’re sticking to good privacy practices?”
And the question of good privacy practices only becomes more urgent when you consider how much more data companies like Facebook and Amazon are poised to have access to. Products like Halo and research projects like Aria may be experimental for now, but that may not always be the case. And, in the absence of stronger regulations, there will be little preventing them from using these new insights about us to further their dominance.
“There are no federal privacy laws in the United States,” Rodriguez said. “People rely on privacy policies, but privacy policies change over time.”
Just a few days after opening up Xbox remote play to Android users, Microsoft has confirmed it’s testing the feature on iOS devices as well. The Verge’s Tom Warren got a look at the new Xbox app in beta, and it works just like you’d expect: It connects directly to your Xbox One (or upcoming consoles), and lets you play anything that’s already on your system. To be clear, this is different from Microsoft’s xCloud service because it’s running off of your own console, something Sony is already doing with its Remote Play app.
So what does this mean for the rest of us? The new iOS Xbox app looks pretty stable, and Warren says that he expects it to arrive on the App Store soon. Microsoft isn’t being specific on availability either, but hopefully it arrives in time for Xbox Series X and S owners to play some games on the go.
Google will not run any election-related ads after polls for the US presidential election close on November 3rd, according to Axios. In an email obtained by the publication, the search giant warns advertisers they won’t be able to run ads “referencing candidates, the election or its outcome, given that an unprecedented amount of votes will be counted after election day this year.”
In the same email, Google says it will likewise ban ads that target people using election-related terms, including the names of specific candidates. Axios reports the policy applies to all of the platforms where the company runs advertisements, including YouTube. We’ve reached out to Google for comment, and we’ll update this article with the company’s response.
Elektron’s Analog Four MKII and Analog Rytm MKII are both serious high-end instruments, at $1,399 and $1,699 respectively. But despite sitting at the top of the Elektron heap, they’ve been missing some of the big features that make the company’s more affordable Digi- and Model: lines so exciting. With Analog Four OS 1.50 and Analog Rytm OS 1.60, both are finally getting a step recording mode and trig probability, giving them the full sequencing power that other Elektron devices enjoy. Now you can manually build out drum patterns or punch in chords even if your fingers aren’t fast enough to play live. Probability also brings a dash of randomness so that things don’t get stale. You can also easily preview trigs in your sequence now, without having to hit play and listen through your whole pattern.
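Trig probability is conceptually simple: each step in a pattern carries its own chance of actually firing on a given pass, so the same pattern plays out a little differently every loop. Here’s a minimal Python sketch of that idea — the function and the example pattern are purely illustrative, not Elektron’s implementation:

```python
import random

def play_pattern(steps, rng=random.random):
    """Return which steps fire on one pass through the pattern.

    `steps` maps step index -> trigger probability (0.0-1.0),
    mimicking per-trig probability on a step sequencer.
    """
    fired = []
    for index, probability in sorted(steps.items()):
        # Each trig rolls independently against its own probability.
        if rng() < probability:
            fired.append(index)
    return fired

# A simple kick pattern: downbeats always fire,
# a ghost note on step 7 fires about half the time.
pattern = {0: 1.0, 4: 1.0, 7: 0.5, 8: 1.0}
print(play_pattern(pattern))
```

Run the loop a few times in a row and the ghost note drops in and out, which is exactly the “dash of randomness” the feature is meant to add.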
Both Analogs are also now class-compliant USB audio sources. That means you don’t need Elektron’s Overbridge or a separate audio interface to connect them to your computer or mobile device. So now it’s much easier to get your glitchy drums off the Rytm and into your DAW of choice, whether that’s Ableton on a Windows PC or GarageBand on an iPhone.
So yeah, it would appear the convergence of Apple’s Mac and iPad software is well underway. There’s one more thing we need to talk about, though: the Apple Pencil. I’d be lying if I said I wasn’t bummed this iPad uses Apple’s first-gen stylus from five years ago, but at least iPadOS 14 gives you more ways to use it.
The most notable Pencil-focused addition is Scribble, which lets you just start writing in any text field. From there, iPadOS does its best to render your chicken scratch into machine-readable text. You don’t have to get your pen strokes smack in the middle of the field either! As long as you’re close, iPadOS will figure out where you actually meant to write and take it from there.
Engadget
Now, I’ll be the first to admit my penmanship ranges from pretty good to doctor-level illegible depending on how fast I’m going, but I’ve been surprised by how accurate the results have been. Of course, mistakes happen, and thankfully it’s easy enough to fix errors with a cluster of on-screen controls that appear at the bottom of the screen. This is my one quibble with Scribble: If you flub a URL or a Google search term, having to move your hand down to those controls can get you out of a groove pretty quickly. I know how minor this sounds, but if you’re like me and make back-to-back typos all the time, the back-and-forth gets old fast.
The smart move would’ve been for iPadOS to dynamically place that “palette” on screen near whichever text field you’re writing in. I’m adding that to my wishlist for iPadOS 14.1. Still, if you’re the type of person who uses the Pencil frequently anyway, I can’t overstate how helpful Scribble is; it means you don’t have to put the Pencil down to use all your other software.
Speaking of other software, the Notes app has been revamped with a slew of new Pencil features. If you deal with diagrams frequently, Notes will “quantize” your doodled polygons, arrows, and hearts, turning them into geometrically precise figures. If marking up flowcharts isn’t your thing, you can double-tap anything you’ve written to select it — from there, you can select as much of your scrawl as needed and paste it as plain text, or just rearrange it on the page. Perhaps best of all, the Notes app is constantly processing what you write as soon as you write it, so it knows to treat some snippets differently than others.
Messenger probably isn’t the greatest option for your main messaging app, but being able to reset the default could let you choose apps that offer a better experience than Messenger or Apple’s Messages.
Android already lets users choose their preferred messaging app. Sadly, Apple is probably not going to give users that choice. Apple’s Messages app is still one reason that people buy Apple hardware, and the company uses its encrypted messages to brag about its privacy practices.
But not allowing users to choose their default messaging app could add to the argument that Apple practices “monopolist behaviors.” The company is facing increased criticism over its App Store fees, and it is the target of multiple antitrust investigations. Today, Epic Games, Spotify and others announced the Coalition for App Fairness, an alliance to pressure both Apple and Google to change their app store rules and other restrictive policies.
Henry also mentioned the S20’s build quality, saying it “didn’t feel as premium as past phones” and that it “would have been nice to get a proper black color” for the handset. Jun Jie was likewise disappointed with the colors on the Ultra: “You went from Aura-ish colors on the Note10+ to Cosmic Grey on the S20 Ultra that’s more dull than my future. Why?” And both Henry and Steve wanted a headphone jack on the S20 and S20 Ultra, respectively.
Screen
The screens on all three handsets were a big hit with users. Sneak said the S20’s display is amazing, while Ryan found the screen on the S20+ beautiful, adding that he can use the 120Hz mode with no noticeable difference in resolution. However, he did say that the “screen glass is easily susceptible to scratching,” and that “after a month of careful use, there are three or four small scratches noticeable when the screen is off. The notion that Gorilla Glass is somehow impervious to scratching is clearly a myth.”
Cherlynn Low/Engadget
When it came to the 120Hz refresh rate on her S20+, Brianna was enthusiastic, saying she “loves the buttery smooth refresh rate” and that she “never knew I needed 120Hz in my life until I saw it in person! Never going back!” Charlie called the screen on the S20 Ultra beautiful, Jun Jie found it glorious and Steve admitted the large screen was one of his “killer apps” on the Ultra, but he skips using the 120Hz mode because it drains the battery.
Camera
There was very little negative feedback about the camera features of the S20 lineup. The S20 and S20+ both have a 3x optical zoom system, while the S20 Ultra boasts a 100x Space Zoom with a 4x optical zoom. Sneak liked the camera on their S20, but Nick was disappointed that his S20+ doesn’t feature a real telephoto camera and instead crops a 64MP frame.
Cherlynn Low/Engadget
S20 Ultra users were more detailed about their experiences. Derek called the camera cool, despite having to return his initial handset because of an issue with it. Steve said he “uses the Pro mode all the time and I love the level of control. I have used the 100x zoom, and while it’s not perfect, it’s better than not having the option at all.” And Charlie found the camera to be amazing, adding that “it has focus issues sometimes but I expect that to be fixed with software updates in the near future. The zoom capability is incredible and very helpful in my job.”
Battery
The battery life of the phones was only briefly mentioned by the reviewers. David and Nick felt let down by the battery life of their S20 and S20+, respectively. David said he was “disappointed with my phone’s battery life compared to my previous phones, and the phones of others in my family.”
Cherlynn Low/Engadget
Meanwhile, Ryan and Jun Jie had the opposite experience. Jun Jie listed battery life as one of the many advantages of going with an S20 Ultra, and Ryan said the battery on his S20+ lasts “considerably longer than my S7, and I can use the phone all day without worrying about recharging.”
Comparisons
Our users were fairly critical when comparing their handsets to other phone models. David said “one of my biggest frustrations with the S20 is the tediously slow on-screen fingerprint unlock, to the point that I am considering switching back to an LG V series.” He felt that “overall, the S20 is a satisfactory phone but … my previous flagship, the LG V30+, gave a better ownership experience.” Ryan, who upgraded to the S20+ from an S7, said it took him a few weeks to adjust to the size of the newer phone. Nick, who also owns an S20+, felt it was a bad thing that the handset “is so similar to all other A-series Samsungs that you cannot easily tell the difference. It’s not a very shiny flagship, as previous models were. I was twice as excited when I bought my S7 Edge, which it replaced.” Steve was pragmatic about his S20 Ultra, saying “this phone is good for a while but next time I’ll probably look at the ‘A’ series. Better bang for the buck.” Derek was less matter-of-fact about his S20 Ultra: “I’ve learned my lesson and this is the last S series phone I will buy. I’m going back to the Note phones I was buying. This phone was not worth the price.”
Cherlynn Low/Engadget
However, a few users of each handset were more pleased with their purchases. Sneak was “extremely glad that the Bixby button is gone, and I’m also glad that Samsung didn’t put the power and volume buttons on the ‘wrong’ side like they did with the Note 10 and 10+.” And Jun Jie and Charlie were both happy with their S20 Ultras, with Jun Jie stating there are “many praises to be sung about this phone,” and Charlie finding it an “incredible phone in many ways.”
The disparity between these two leagues can’t solely be explained by Twitch viewership. The 2019 Call of Duty finals drew a peak concurrent audience of just 182,000 on Twitch, according to The Esports Observer, and this year’s 330,000 figure was actually a record for the franchise. In terms of live esports audience numbers, Call of Duty can’t compete with League of Legends, and it consistently struggles to keep up with comparable first-person shooters like Counter-Strike or even Overwatch.
Activision’s solution? Change the metrics.
“We did have to change our mindset,” said Brandon Snow, who leads Activision’s esports partnerships, including its relationship with YouTube. “We had to move away a little bit from, ‘Hey, it’s all about how many people are watching at any one time,’ to, ‘It’s all about how many people are engaging with our content on the platform.’ Part of that is live and we’ve had some good live numbers, but part of it is also the content, and the shoulder content, and the VOD content that we put around it.”
Snow has been with Activision Blizzard for nearly three years, and before that he worked in traditional sports as an internal profit consultant for the NBA. He talked about maximizing the YouTube platform by pushing out fresh, entertaining videos that extend the brand and weave narratives around players, casters and popular personalities. The Call of Duty League YouTube channel reflects this effort, with videos like How to get the MOST kills in Warzone?! and IT’S CHAMPS TIME, WHO WILL BE CHAMPIONS?! (plus a few more titles that may or may not make sense with “?!” ending punctuation).
In this ecosystem, streaming a match and publishing the VOD is only part of the equation. It’s about creating original content, engaging and retaining viewers, and attracting people who aren’t drawn to live matches on their own.
“That’s a much different mindset than just running a live-event business on a platform like Twitch,” Snow said.
Activision’s quest to engage new viewers on YouTube reflects a broader strategy to expand the Call of Duty audience. With last year’s Modern Warfare, this meant making the game more friendly to new and casual players, with less of a focus on competitive features — tricky, when simultaneously attempting to establish an esports league. Modern Warfare still doesn’t have a ranked mode, and developers at Infinity Ward said they designed its maps to be easier for new players to navigate.
The reason Modern Warfare has trash maps is because they were designed to give new players a “safe space” to camp in LMFAOOO look how stunned he is when asked if the hardcore fans will be affected by this pic.twitter.com/kNy0bVGOmO
These moves have pissed off plenty of competitive players, including 100 Thieves CEO and former Call of Duty pro, Matthew “Nadeshot” Haag. He laid out some of his issues on an episode of The Mobcast in November, saying, “They made this game so that a casual player would have a great experience and that the best players and competitive players would have a negative experience.”
Nadeshot continued, referencing his former team, “If OpTic is not playing in the finals, there’s like 40,000 people watching a tournament. And I’m not even trying to sound like an asshole, that’s just the reality. …They have this thesis that if they create this league, they’re gonna have like 500,000 or a million people watching a Call of Duty tournament.” He called that idea “insane.”
This is all piled on top of recurring concerns about the league’s stability, partially tied to Activision’s annual release schedule for Call of Duty games, which means the competitive scene is uprooted each year. Star talent like Scump and FormaL are aging and no one is showing up to replace them. Players and fans are growing increasingly vocal about Activision’s perceived mismanagement of Call of Duty esports.
“They’re very passionate,” Snow said. “We learn a lot from that passion, and quite frankly, they have a lot of good ideas that we need to be very mindful of.”
The rebranded CDL held its inaugural season in 2020, alongside a new competition format, minimum salaries and benefits for players, and a location-based franchise model, with teams spending a reported $25 million apiece to participate. Twelve teams competed in 2020, and after a mid-year format change, all of them were ushered through to the playoffs.
Activision is playing a multi-pronged game, and esports are only part of it. Call of Duty is a blockbuster franchise outside of the pro scene, and its support systems are mainly geared toward engaging that massive, core player base, not catering to the smaller competitive crowd.
On YouTube, this means dropping more story- and personality-driven content in VOD format. The goal, Snow said, is to expand the CDL audience by creating videos that will attract lapsed or casual Call of Duty fans, and pull them into the pro scene.
Naturally, this also means diversification. Right now, the CDL audience is heavily male, with a large concentration in ages 18 to 35.
“We absolutely have to diversify and get more broad in who engages with the league,” Snow said, noting that inclusion was one of the top objectives for himself and CDL Commissioner Johanna Faries. “We want to deliver what the core likes to see, but we wanted to find ways to engage a broader audience, I’ll call it a casual gaming audience,” he said.
Snow described the Call of Duty market as a dartboard with 200 million points, or fans, on it. Hardcore players are the bullseye, but there are plenty more points in the surrounding layers.
“We’re very much focused on, how does the product we put out there in CDL not only engage with the center of our bullseye, but just tapping into that 200 million will help us engage a massive audience that should be much more broad than we are today,” Snow said. “And I think we’re laser-focused on figuring that out. And hopefully we’ll be trying a bunch of different things next year, around different products we’ve got in the Call of Duty franchise to help us get there.”
He didn’t specify what those products would be, but he talked about ongoing efforts to expand the Call of Duty brand, like Call of Duty Mobile, Warzone and Cold War. The goal is to stay true to the franchise while simultaneously adding components to tempt casual players, maintaining a robust competitive scene, and growing a new league exclusively on YouTube.
“It can be a balancing act, but we believe it’s possible,” Snow said.
Once OSIRIS-REx descends to Bennu’s surface, a robotic sampling arm will perform a Touch-And-Go (TAG) collection. The goal is to collect at least two ounces, or 60 grams, of rocky material. If the first TAG attempt in October does not collect enough material, OSIRIS-REx has onboard nitrogen charges to power two more attempts.
The spacecraft is scheduled to depart Bennu in 2021. It will deliver the collected sample to Earth on September 24th, 2023. While that’s still a few years away, NASA has already made useful discoveries through OSIRIS-REx. For instance, NASA has spotted water on the asteroid, and we now know that Bennu is spewing particles into space. The team has also produced some of the highest resolution images of a planetary body ever.