<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>bias &#8211; EFR Technology Group</title>
	<atom:link href="https://www.efrtechgroup.com/category/bias/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.efrtechgroup.com</link>
	<description>We maintain technology so you don't have to!</description>
	<lastBuildDate>Mon, 21 Sep 2020 19:56:13 +0000</lastBuildDate>
	<language>en</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://www.efrtechgroup.com/wp-content/uploads/2019/02/cropped-EFRTG-color-2-32x32.jpg</url>
	<title>bias &#8211; EFR Technology Group</title>
	<link>https://www.efrtechgroup.com</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>The LAPD has used facial recognition software 30,000 times since 2009</title>
		<link>https://www.efrtechgroup.com/tech/the-lapd-has-used-facial-recognition-software-30000-times-since-2009/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Mon, 21 Sep 2020 19:56:13 +0000</pubDate>
				<category><![CDATA[bias]]></category>
		<category><![CDATA[crime]]></category>
		<category><![CDATA[facial recognition]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[Law Enforcement]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[police]]></category>
		<category><![CDATA[Privacy]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[tomorrow]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/the-lapd-has-used-facial-recognition-software-30000-times-since-2009/</guid>

					<description><![CDATA[For years, the Los Angeles Police Department (LAPD) hasn&#8217;t given a clear answer on whether it uses facial recognition in its policing work. That changed this week. On Monday, the agency told The Los Angeles Times it has used the technology nearly 30,000 times since late 2009. The LAPD uses the Los Angeles County Regional Identification [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>For years, the Los Angeles Police Department (LAPD) hasn&#8217;t given a clear answer on whether it uses facial recognition in its policing work. That changed this week. On Monday, the agency <a href="https://www.latimes.com/california/story/2020-09-21/lapd-controversial-facial-recognition-software" target="_blank" rel="noopener noreferrer">told <em>The Los Angeles Times</em></a> it has used the technology nearly 30,000 times since late 2009.</p>
<p>The LAPD uses the <a href="https://lacris.org/" target="_blank" rel="noopener noreferrer">Los Angeles County Regional Identification System</a> (LACRIS), a database of more than 9 million mugshots maintained by the Los Angeles County Sheriff&#8217;s Department. At one point, more than 500 LAPD personnel had access to the system, though the department claims that the number is closer to 300 in recent months. Josh Rubenstein, a spokesperson for the LAPD, said he couldn&#8217;t be sure how many arrests LACRIS has helped the police department make. However, he said, &#8220;No individuals are arrested by the LAPD based solely on facial recognition results.”</p>
</div>
<p><a href="https://www.engadget.com/lapd-facial-recogntion-dataworks-plus-195613849.html">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Twitter has &#8216;more analysis to do&#8217; after algorithm shows possible racial bias</title>
		<link>https://www.efrtechgroup.com/tech/twitter-has-more-analysis-to-do-after-algorithm-shows-possible-racial-bias/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Sun, 20 Sep 2020 20:33:37 +0000</pubDate>
				<category><![CDATA[algorithm]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[racism]]></category>
		<category><![CDATA[Social Media]]></category>
		<category><![CDATA[social network]]></category>
		<category><![CDATA[social networking]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[Twitter]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/twitter-has-more-analysis-to-do-after-algorithm-shows-possible-racial-bias/</guid>

					<description><![CDATA[Twitter is learning first-hand about the challenges of eliminating racial bias in algorithms. The social network’s Liz Kelley said the company had “more analysis” to do after cryptographic engineer Tony Arcieri conducted an experiment suggesting Twitter’s algorithm was biased in prioritizing photos. When attaching photos of Barack Obama and Mitch McConnell to tweets, Twitter [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>Twitter is learning first-hand about the challenges of <a href="https://search.engadget.com/click/_ylt=AwrJ7FtPs2dfAPoA7gx8BWVH;_ylu=Y29sbwNiZjEEcG9zAzEEdnRpZAMEc2VjA3Nj/RV=2/RE=1600660431/RO=10/RU=https%3a%2f%2fwww.engadget.com%2f2018-02-12-facial-analysis-ai-has-racial-gender-bias.html/RK=2/RS=P6v1TglesfM6BwmTVd81ja1UNIE-">eliminating racial bias</a> in algorithms. The social network’s Liz Kelley <a href="https://twitter.com/lizkelley/status/1307742267193532416?s=21" target="_blank" rel="noopener noreferrer">said</a> the company had “more analysis” to do after cryptographic engineer Tony Arcieri <a href="https://twitter.com/bascule/status/1307440596668182528" target="_blank" rel="noopener noreferrer">conducted</a> an experiment suggesting Twitter’s algorithm was biased in prioritizing photos. When attaching photos of <a href="https://www.engadget.com/2017-01-21-obamas-legacy-the-most-tech-savvy-president.html">Barack Obama</a> and <a href="https://www.engadget.com/2019-08-09-twitter-unfreezes-mitch-mcconnells-campaign-account-after-revie.html">Mitch McConnell</a> to tweets, Twitter seemed to exclusively highlight McConnell’s face — Obama only popped up when Arcieri inverted the colors, making skin color a non-issue.</p>
<p>Others tried reversing photo and name orders to no avail. A higher-contrast smile did work, Intertheory’s Kim Sherrell <a href="https://twitter.com/kim/status/1307548258491801600" target="_blank" rel="noopener noreferrer">found</a>. Scientist Matt Blaze, meanwhile, <a href="https://twitter.com/mattblaze/status/1307464872398147584" target="_blank" rel="noopener noreferrer">noticed</a> that the priority seemed to vary depending on the official Twitter app used. Tweetdeck was more neutral, for instance.</p>
</div>
<p><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<p><a href="https://www.engadget.com/twitter-responds-to-algorithm-racial-bias-claims-203337604.html">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Recommended Reading: The Magic Leap project the world may never see</title>
		<link>https://www.efrtechgroup.com/tech/recommended-reading-the-magic-leap-project-the-world-may-never-see/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Sat, 11 Jul 2020 14:00:38 +0000</pubDate>
				<category><![CDATA[bias]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[Magic Leap]]></category>
		<category><![CDATA[mixed reality]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[quibi]]></category>
		<category><![CDATA[rec reading]]></category>
		<category><![CDATA[recommended reading]]></category>
		<category><![CDATA[streaming]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[Twitter]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/recommended-reading-the-magic-leap-project-the-world-may-never-see/</guid>

					<description><![CDATA[Fading light: The story of Magic Leap’s lost mixed reality magnum opus (Adi Robertson, The Verge). Like a lot of companies this year, Magic Leap faced massive layoffs. The company was able to avoid those after it raised $350 million, but it did shift to e&#8230;]]></description>
										<content:encoded><![CDATA[<p><img decoding="async" src="https://www.efrtechgroup.com/wp-content/uploads/2020/07/Recommended-Reading-The-Magic-Leap-project-the-world-may-never.jpeg" />Fading light: The story of Magic Leap’s lost mixed reality magnum opus (Adi Robertson, The Verge). Like a lot of companies this year, Magic Leap faced massive layoffs. The company was able to avoid those after it raised $350 million, but it did shift to e&#8230;</p>
<p><a href="https://www.engadget.com/recommended-reading-the-magic-leap-project-the-world-may-never-see-133038720.html">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Boston bans police and city use of facial recognition software</title>
		<link>https://www.efrtechgroup.com/tech/boston-bans-police-and-city-use-of-facial-recognition-software/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Wed, 24 Jun 2020 19:33:03 +0000</pubDate>
				<category><![CDATA[aclu]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[boston]]></category>
		<category><![CDATA[crime]]></category>
		<category><![CDATA[facial recognition]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[Law Enforcement]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[police]]></category>
		<category><![CDATA[Privacy]]></category>
		<category><![CDATA[racism]]></category>
		<category><![CDATA[Tech]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/boston-bans-police-and-city-use-of-facial-recognition-software/</guid>

					<description><![CDATA[The ban comes after software called DataWorks Plus and human error led to the wrongful arrest of a Black man in Detroit. Police arrested Robert Williams after the software incorrectly matched his driver&#8217;s license photo to security footage from the scene of a burglary. The incident is the first known instance of a wrongful [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>The ban comes after software called DataWorks Plus and human error led to the <a href="https://www.engadget.com/facial-recognition-wrongful-arrest-michigan-141531393.html">wrongful arrest of a Black man in Detroit</a>. Police arrested Robert Williams after the software incorrectly matched his driver&#8217;s license photo to security footage from the scene of a burglary. The incident is the first known instance of a wrongful arrest in the US based on an incorrect facial recognition match, according to the American Civil Liberties Union (ACLU). However, research suggests most facial recognition software isn&#8217;t great at correctly identifying people of color. A 2018 <a href="https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212" class="_e75a791d-denali-editor-page-rtfLink" target="_blank" rel="noopener noreferrer">MIT study</a>, for instance, found that three commercially available programs incorrectly identified dark-skinned women as much as 34.7 percent of the time. </p>
<p>&#8220;This is a crucial victory for our privacy rights and for people like Robert Williams, who have been arrested for crimes they didn&#8217;t commit because of a technology law enforcement shouldn&#8217;t be using,” said ACLU of Massachusetts executive director Carol Rose. “Lawmakers nationwide should follow suit and immediately stop law enforcement use of this technology. This surveillance technology is dangerous when right, and dangerous when wrong.&#8221;</p>
<p>It&#8217;s worth pointing out that once Boston Mayor Marty Walsh signs the bill into law, it won&#8217;t prevent federal agencies like the FBI from using the tech while conducting investigations in the city. Still, with cities like <a href="https://www.engadget.com/2019-05-14-san-francisco-bans-city-use-of-facial-recognition.html" class="_e75a791d-denali-editor-page-rtfLink">San Francisco</a> and Boston limiting the use of the technology, other jurisdictions may do the same. </p>
</div>
<p><a href="https://www.engadget.com/boston-bans-facial-recognition-193303141.html">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>False facial recognition match leads to a wrongful arrest in Detroit</title>
		<link>https://www.efrtechgroup.com/tech/false-facial-recognition-match-leads-to-a-wrongful-arrest-in-detroit/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Wed, 24 Jun 2020 14:15:31 +0000</pubDate>
				<category><![CDATA[aclu]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[civil liberties]]></category>
		<category><![CDATA[crime]]></category>
		<category><![CDATA[facial recognition]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[Law Enforcement]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[police]]></category>
		<category><![CDATA[Privacy]]></category>
		<category><![CDATA[racism]]></category>
		<category><![CDATA[Tech]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/false-facial-recognition-match-leads-to-a-wrongful-arrest-in-detroit/</guid>

					<description><![CDATA[Many critics of police facial recognition use warn of the potential for racial bias that leads to false arrests, and unfortunately that appears to have happened. The ACLU has filed a complaint against Detroit police for the wrongful arrest of Robert Williams when a DataWorks Plus facial recognition system incorrectly matched security footage against [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>Many critics of police facial recognition use warn of the <a href="https://www.engadget.com/2018-02-12-facial-analysis-ai-has-racial-gender-bias.html">potential for racial bias</a> that leads to false arrests, and unfortunately that appears to have happened. The ACLU has <a href="https://www.aclu.org/press-releases/man-wrongfully-arrested-because-face-recognition-cant-tell-black-people-apart" target="_blank" rel="noopener noreferrer">filed a complaint</a> against Detroit police for the wrongful arrest of Robert Williams when a DataWorks Plus facial recognition system incorrectly matched security footage against Williams’ driver’s license, marking him as a suspect. Officers showed the match to an offsite security consultant who identified Williams as the culprit, but this person never saw the perpetrator first-hand.</p>
<p>The ACLU argued that the DataWorks system “can’t tell Black people apart” and that the whole system was “tainted” by officers’ assumptions that the facial recognition system produced the right suspect. In a <em>Washington Post</em> <a href="https://www.washingtonpost.com/opinions/2020/06/24/i-was-wrongfully-arrested-because-facial-recognition-why-are-police-allowed-use-this-technology/" target="_blank" rel="noopener noreferrer">opinion piece</a>, Williams added that he was concerned about the tech even if it was completely accurate — he didn’t want his daughters’ faces to go into a database and prompt future police questioning when they’re spotted at a “protest the government didn’t like.”</p>
</div>
<p><a href="https://www.engadget.com/facial-recognition-wrongful-arrest-michigan-141531393.html">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>IBM stops work on facial recognition over human rights concerns</title>
		<link>https://www.efrtechgroup.com/ai/ibm-stops-work-on-facial-recognition-over-human-rights-concerns/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Tue, 09 Jun 2020 01:02:15 +0000</pubDate>
				<category><![CDATA[Ai]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[arvind krishna]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[facial recognition]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[human rights]]></category>
		<category><![CDATA[ibm]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Privacy]]></category>
		<category><![CDATA[racism]]></category>
		<category><![CDATA[surveillance]]></category>
		<category><![CDATA[Tech]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/ibm-stops-work-on-facial-recognition-over-human-rights-concerns/</guid>

					<description><![CDATA[Krishna’s letter was part of a broader call on Congress to push for police accountability and conduct reforms, including some that were already part of the recently introduced Justice in Policing Act of 2020. The move comes in the midst of protests over police brutality and discrimination, and not long after Clearview AI’s [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>Krishna’s letter was part of a broader call on Congress to push for police accountability and conduct reforms, including some that were already part of the recently introduced <a href="https://www.engadget.com/police-reform-bill-body-cameras-215036775.html">Justice in Policing Act of 2020</a>.</p>
<p>The move comes in the midst of protests over police brutality and discrimination, and not long after <a href="https://www.engadget.com/aclu-sues-clearview-ai-164518487.html">Clearview AI’s</a> facial recognition raised privacy and bias issues. More than one report has indicated that facial recognition systems <a href="https://www.engadget.com/2018-02-12-facial-analysis-ai-has-racial-gender-bias.html">can be biased</a> against non-whites and women, particularly if the training data includes relatively few people from those groups. And while some facial recognition systems may only correlate faces with publicly available data, there are concerns this could be used for tracking and profile generation that could be used to intimidate people or otherwise limit their real-world privacy.</p>
<p>As <em>CNBC</em> noted, it’s relatively easy for IBM to back out, since facial recognition wasn’t a major contributor to its bottom line. The media buzz may be as important as anything. IBM is still a major company, though, and it frequently works with governments. This could spur other providers to follow suit, and might even get some would-be customers to drop facial recognition entirely.</p>
</div>
<p><a href="https://www.engadget.com/ibm-exits-facial-recognition-business-012915316.html">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Banjo CEO resigns to preserve the company&#8217;s AI surveillance deals</title>
		<link>https://www.efrtechgroup.com/ai/banjo-ceo-resigns-to-preserve-the-companys-ai-surveillance-deals/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Sun, 10 May 2020 02:05:23 +0000</pubDate>
				<category><![CDATA[Ai]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[banjo]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[damien patton]]></category>
		<category><![CDATA[facial recognition]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[mass surveillance]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Privacy]]></category>
		<category><![CDATA[racism]]></category>
		<category><![CDATA[surveillance]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[utah]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/banjo-ceo-resigns-to-preserve-the-companys-ai-surveillance-deals/</guid>

					<description><![CDATA[Utah had put its surveillance contract with Banjo on hold after learning that Patton had been a KKK member as a teenager, and had joined a group leader in a drive-by shooting. Patton had renounced his past and vowed that it didn’t affect his company’s practices, but Utah paused its use of the technology [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>Utah had put its surveillance contract with Banjo on hold after learning that Patton had been a KKK member as a teenager, and had joined a group leader in a drive-by shooting. Patton had renounced his past and vowed that it didn’t affect his company’s practices, but Utah paused its use of the technology and launched an audit to verify that there wasn’t an <a href="https://www.engadget.com/2019-01-25-amazon-rekognition-facial-analysis-gender-race-bias-mit.html">algorithmic bias</a> in its data gathering from cameras, call centers and emergency vehicles.</p>
<p>It’s not clear how Patton’s exit will influence Utah’s response, if at all. However, it theoretically eliminates the possibility that the history of Banjo’s founder will play a role in future projects. Not that this eliminates underlying concerns about the surveillance itself. Critics are still concerned that Banjo’s system has access to <a href="https://www.engadget.com/2020-03-04-banjo-ai-utah-law-enforcement-surveillance.html">vast amounts of information</a> in real time, and it’s not clear how well the company scrubs out personal data.</p>
</div>
<p><a href="https://www.engadget.com/banjo-ceo-resigns-020523666.html">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Utah pauses Banjo&#8217;s AI surveillance after learning of owner&#8217;s racist past</title>
		<link>https://www.efrtechgroup.com/ai/utah-pauses-banjos-ai-surveillance-after-learning-of-owners-racist-past/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Tue, 28 Apr 2020 20:52:43 +0000</pubDate>
				<category><![CDATA[Ai]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[banjo]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[damien patton]]></category>
		<category><![CDATA[facial recognition]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[mass surveillance]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Privacy]]></category>
		<category><![CDATA[racism]]></category>
		<category><![CDATA[surveillance]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[utah]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/utah-pauses-banjos-ai-surveillance-after-learning-of-owners-racist-past/</guid>

					<description><![CDATA[Utah is putting its AI surveillance system on ice after learning of its creator’s background. The state has suspended (via Salt Lake Tribune) Banjo’s contract after learning from a OneZero report that company head Damien Patton was part of the Dixie Knights of the Ku Klux Klan as a teenager and joined the racist group’s [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>Utah is putting its <a href="https://www.engadget.com/2020-03-04-banjo-ai-utah-law-enforcement-surveillance.html">AI surveillance system</a> on ice after learning of its creator’s background. The state has <a href="https://attorneygeneral.utah.gov/statement-on-reports-against-banjo-founder/" target="_blank" rel="noopener noreferrer">suspended</a> (via <a href="https://www.sltrib.com/news/politics/2020/04/28/utah-attorney-general/?utm_source=pushly" target="_blank" rel="noopener noreferrer"><em>Salt Lake Tribune</em></a>) Banjo’s contract after learning from a <em>OneZero</em> <a href="https://onezero.medium.com/ceo-of-surveillance-firm-banjo-once-helped-kkk-leader-shoot-up-synagogue-fdba4ad32829" target="_blank" rel="noopener noreferrer">report</a> that company head Damien Patton was part of the Dixie Knights of the Ku Klux Klan as a teenager and joined the racist group’s leader in an anti-Semitic drive-by shooting. While Patton has expressed remorse for his past, according to Utah Attorney General Sean Reyes, officials were concerned enough that they felt it was safer to put an advisory committee and independent audit in place to tackle concerns like privacy and “possible bias.”</p>
<p>Banjo’s deal with Utah lets it <a href="https://www.engadget.com/2020-03-07-recommended-reading-banjo-ai-surveillance-utah.html">combine data</a> from city infrastructure (such as public cameras and 911) with internet content to spot “anomalies,” theoretically detecting some crimes as they happen. The firm is supposed to strip all personal data from the system, but the methods and effectiveness aren’t clear. There’s also the matter of AI bias. Facial recognition systems sometimes have <a href="https://www.engadget.com/2019-01-25-amazon-rekognition-facial-analysis-gender-race-bias-mit.html">gender and race biases</a> that lead to false matches — a particular problem when it could lead to wrongful arrests and confrontations.</p>
</div>
<p><a href="https://www.engadget.com/utah-suspends-use-of-banjo-surveillance-205243913.html">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Court finds algorithm bias studies don&#8217;t violate US anti-hacking law</title>
		<link>https://www.efrtechgroup.com/tech/court-finds-algorithm-bias-studies-dont-violate-us-anti-hacking-law/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Sun, 29 Mar 2020 20:49:00 +0000</pubDate>
				<category><![CDATA[aclu]]></category>
		<category><![CDATA[algorithm]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[civil liberties]]></category>
		<category><![CDATA[discrimination]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[law]]></category>
		<category><![CDATA[Politics]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[web]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/court-finds-algorithm-bias-studies-dont-violate-us-anti-hacking-law/</guid>

					<description><![CDATA[Bates observed that many sites&#8217; terms of service (which are frequently buried, cryptic or both) didn&#8217;t provide sufficient notice to make people criminally liable, and that it&#8217;s problematic for private sites to define criminal liability. The judge also found that the government was using an overly broad interpretation when it&#8217;s supposed to use [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>Bates observed that many sites&#8217; <a href="https://www.engadget.com/2020-02-22-google-new-terms-of-service.html">terms of service</a> (which are frequently buried, cryptic or both) didn&#8217;t provide sufficient notice to make people criminally liable, and that it&#8217;s problematic for private sites to define criminal liability. The judge also found that the government was using an overly broad interpretation when it&#8217;s supposed to use a narrow view whenever there&#8217;s ambiguity.</p>
<p>It&#8217;s not certain if the government intends to contest the ruling. If it doesn&#8217;t (or loses), however, this effectively greenlights future bias studies without the approval of site operators. Facebook and other social networks could still have the power to kick researchers off their networks or file civil suits, but they couldn&#8217;t threaten federal charges and prison time.</p>
</div>
<p><a href="https://www.engadget.com/2020/03/29/court-ruling-allows-algorithm-bias-studies/">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Amazon wants to question Trump over his role in the $10 billion DoD contract</title>
		<link>https://www.efrtechgroup.com/tech/amazon-wants-to-question-trump-over-his-role-in-the-10-billion-dod-contract/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Mon, 10 Feb 2020 16:49:00 +0000</pubDate>
				<category><![CDATA[Amazon]]></category>
		<category><![CDATA[aws]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[business]]></category>
		<category><![CDATA[contract]]></category>
		<category><![CDATA[department of defense]]></category>
		<category><![CDATA[depose]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[jedi]]></category>
		<category><![CDATA[lawsuit]]></category>
		<category><![CDATA[pentagon]]></category>
		<category><![CDATA[Politics]]></category>
		<category><![CDATA[president trump]]></category>
		<category><![CDATA[Security]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[trump administration]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/amazon-wants-to-question-trump-over-his-role-in-the-10-billion-dod-contract/</guid>

					<description><![CDATA[In court documents unsealed Monday, Amazon said it is looking to depose &#8220;individuals who were instrumental&#8221; in the JEDI selection process, including Dana Deasy, the Defense Department&#8217;s chief information officer, Defense Secretary Mark Esper and former Defense Secretary James Mattis. The Department of Defense (DoD) awarded the $10 billion JEDI contract to Microsoft last [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>In court documents unsealed Monday, Amazon said it is looking to depose &#8220;individuals who were instrumental&#8221; in the JEDI selection process, including Dana Deasy, the Defense Department&#8217;s chief information officer, Defense Secretary Mark Esper and former Defense Secretary James Mattis.</p>
<p>The Department of Defense (DoD) awarded the $10 billion JEDI contract to Microsoft last October. The project is meant to modernize the DoD&#8217;s cloud infrastructure and connect different divisions within the agency, which currently has over 500 separate clouds. Amazon quickly challenged the decision and <a href="https://www.engadget.com/2019/11/23/amazon-sues-over-microsoft-jedi-contract/">filed a lawsuit</a>. The company claims Trump&#8217;s <a href="https://www.engadget.com/2019/12/09/amazon-jedi-lawsuit-trump-vendetta/">&#8220;personal vendetta&#8221;</a> and instruction to &#8220;screw Amazon&#8221; cost it the contract.</p>
<p>In a statement provided to <em>CNBC</em>, an Amazon Web Service spokesperson said:</p>
<blockquote>
<p><small>&#8220;President Trump has repeatedly demonstrated his willingness to use his position as President and Commander in Chief to interfere with government functions – including federal procurements – to advance his personal agenda. The preservation of public confidence in the nation&#8217;s procurement process requires discovery and supplementation of the administrative record, particularly in light of President Trump&#8217;s order to &#8216;screw Amazon.&#8217; The question is whether the President of the United States should be allowed to use the budget of the DoD to pursue his own personal and political ends.&#8221;</small></p>
</blockquote></div>
<p><a href="https://www.engadget.com/2020/02/10/amazon-trump-jedi-contract/">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
