<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>faces &#8211; EFR Technology Group</title>
	<atom:link href="https://www.efrtechgroup.com/category/faces/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.efrtechgroup.com</link>
	<description>We maintain technology so you don't have to!</description>
	<lastBuildDate>Thu, 05 Dec 2019 20:44:00 +0000</lastBuildDate>
	<language>en</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://www.efrtechgroup.com/wp-content/uploads/2019/02/cropped-EFRTG-color-2-32x32.jpg</url>
	<title>faces &#8211; EFR Technology Group</title>
	<link>https://www.efrtechgroup.com</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Homeland Security doesn&#8217;t want Americans&#8217; airport face scans after all</title>
		<link>https://www.efrtechgroup.com/tech/homeland-security-doesnt-want-americans-airport-face-scans-after-all/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Thu, 05 Dec 2019 20:44:00 +0000</pubDate>
				<category><![CDATA[aclu]]></category>
		<category><![CDATA[biometric]]></category>
		<category><![CDATA[border]]></category>
		<category><![CDATA[cbp]]></category>
		<category><![CDATA[customs and border protection]]></category>
		<category><![CDATA[entry-exit]]></category>
		<category><![CDATA[face scan]]></category>
		<category><![CDATA[faces]]></category>
		<category><![CDATA[facial recognition]]></category>
		<category><![CDATA[homeland security]]></category>
		<category><![CDATA[Politics]]></category>
		<category><![CDATA[Privacy]]></category>
		<category><![CDATA[scan]]></category>
		<category><![CDATA[Security]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[tomorrow]]></category>
		<category><![CDATA[travelers]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/homeland-security-doesnt-want-americans-airport-face-scans-after-all/</guid>

					<description><![CDATA[In a statement provided to Engadget, a CBP spokesperson said: &#8220;U.S. Customs and Border Protection is using biometric facial comparison technology to facilitate the entry and exit of international travelers while meeting the Congressional mandate to implement a biometric entry-exit system. U.S. citizens are out of scope of the mandated biometric entry-exit program. However, [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>In a statement provided to Engadget, a CBP spokesperson said:</p>
<blockquote>
<p><small>&#8220;U.S. Customs and Border Protection is using biometric facial comparison technology to facilitate the entry and exit of international travelers while meeting the Congressional mandate to implement a biometric entry-exit system. U.S. citizens are out of scope of the mandated biometric entry-exit program. However, U.S. citizens are required to establish identity and citizenship to CBP and present a valid U.S. passport for international travel.&#8221;</small></p>
</blockquote>
<p>CBP did consider including US citizens in its facial recognition checks to avoid the challenges of having separate processes for foreign nationals and US citizens, <a href="https://techcrunch.com/2019/12/05/homeland-security-drops-airport-citizens-face-scans/"><em>TechCrunch</em></a> reports. &#8220;Upon consultation with Congress and privacy experts, however, CBP determined that the best course of action is to continue to allow U.S. citizens to voluntarily participate in the biometric entry-exit program,&#8221; a CBP spokesperson said.</p>
<p>The ACLU, which spoke out against plans to conduct biometric scans on US citizens, is still concerned. In a statement, ACLU Senior Policy Analyst Jay Stanley said:</p>
<blockquote>
<p><small>&#8220;The Department of Homeland Security&#8217;s plans to spread face recognition surveillance nationwide remain alarming, especially given the lack of congressional authorization and sufficient safeguards, the government&#8217;s past security failures, and unanswered questions about the technology&#8217;s effectiveness, bias, and broader societal implications. The government cannot be trusted with this surveillance technology, and Congress should put the brakes on its use.&#8221;</small></p>
</blockquote></div>
<p><a href="https://www.engadget.com/2019/12/05/homeland-security-cbp-biometric-face-scans/">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>These deepfake celebrity impressions are equally amazing and alarming</title>
		<link>https://www.efrtechgroup.com/tech/these-deepfake-celebrity-impressions-are-equally-amazing-and-alarming/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Fri, 11 Oct 2019 16:48:00 +0000</pubDate>
				<category><![CDATA[celebrity]]></category>
		<category><![CDATA[deepfake]]></category>
		<category><![CDATA[deepfakes]]></category>
		<category><![CDATA[faces]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[impressions]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[jim meskimen]]></category>
		<category><![CDATA[software]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[tomorrow]]></category>
		<category><![CDATA[Video]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/these-deepfake-celebrity-impressions-are-equally-amazing-and-alarming/</guid>

					<description><![CDATA[The video speaks to Meskimen&#8217;s talent as an impressionist but also to the capability of deepfake software. It proves how well the tech is able to blur the line between what&#8217;s real and what isn&#8217;t. It still takes a ton of work, though. According to Sham00k, the full video took just over 250 hours [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>The video speaks to Meskimen&#8217;s talent as an impressionist but also to the capability of deepfake software. It proves how well the tech is able to blur the line between what&#8217;s real and what isn&#8217;t. It still takes a ton of work, though. According to Sham00k, the full video took just over 250 hours of work, 1,200 hours of footage, 300,000 images and close to one terabyte of data to create.</p>
<p><center><iframe allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/5rPKeUXjEvE" width="560"></iframe></center></p>
<p>Tech companies and <a href="https://www.engadget.com/2018/09/14/lawmakers-concerned-deepfake-technology/">lawmakers have taken note</a>. Researchers are developing tools to spot deepfakes, and Google released 3,000 deepfakes in an attempt to further those efforts. <a href="https://www.engadget.com/2019/09/05/facebook-microsoft-mit-fight-deepfakes/">Facebook, MIT and Microsoft</a> are working to fight the fakes. <a href="https://www.engadget.com/2018/02/07/reddit-bans-deepfake-ai-porn/">Reddit</a> has banned AI-generated <a href="https://www.engadget.com/2018/01/30/fake-porn-is-the-new-fake-news-and-the-internet-isn-t-ready/">deepfake porn</a>, and <a href="https://www.engadget.com/2019/10/07/california-deepfake-pornography-politics/">California</a> now lets residents sue anyone who uses the software to put their image in porn without consent. While everyone is up in arms over fake news, deepfakes are developing as the next frontier.</p>
<p><center><iframe allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/Wm3squcz7Aw" width="560"></iframe></center></p></div>
<p><a href="https://www.engadget.com/2019/10/11/deepfake-celebrity-impresonations/">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Adobe trained AI to detect facial manipulation in Photoshop</title>
		<link>https://www.efrtechgroup.com/ai/adobe-trained-ai-to-detect-facial-manipulation-in-photoshop/</link>
		
		<dc:creator><![CDATA[Randall]]></dc:creator>
		<pubDate>Fri, 14 Jun 2019 16:38:00 +0000</pubDate>
				<category><![CDATA[adobe]]></category>
		<category><![CDATA[Ai]]></category>
		<category><![CDATA[convolution neural network]]></category>
		<category><![CDATA[deepfake]]></category>
		<category><![CDATA[detection]]></category>
		<category><![CDATA[face-aware liquify]]></category>
		<category><![CDATA[faces]]></category>
		<category><![CDATA[facial expressions]]></category>
		<category><![CDATA[gear]]></category>
		<category><![CDATA[image manipulation]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[photoshop]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[Robots]]></category>
		<category><![CDATA[Security]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[tomorrow]]></category>
		<category><![CDATA[tool]]></category>
		<category><![CDATA[uc berkeley]]></category>
		<guid isPermaLink="false">https://www.efrtechgroup.com/adobe-trained-ai-to-detect-facial-manipulation-in-photoshop/</guid>

					<description><![CDATA[The team trained a convolutional neural network (CNN) to spot changes in images made with Photoshop&#8217;s Face-Aware Liquify feature, which was designed to change people&#8217;s eyes, mouth and other facial features. When put to the test, the neural network detected altered images up to 99 percent of the time. In comparison, people who [&#8230;]]]></description>
										<content:encoded><![CDATA[
<div>
<p>The team trained a convolutional neural network (CNN) to spot changes in images made with Photoshop&#8217;s <a href="https://helpx.adobe.com/photoshop/how-to/face-aware-liquify.html">Face-Aware Liquify</a> feature, which was designed to change people&#8217;s eyes, mouth and other facial features. When put to the test, the neural network detected altered images up to 99 percent of the time. In comparison, people who saw the same photos only spotted the changes 53 percent of the time. The tool was also able to revert images to what it predicted was their original state.</p>
<p>This isn&#8217;t the first time Adobe has <a href="https://www.engadget.com/2018/06/22/adobe-photoshop-artificial-intelligence-fake-images/">used AI to spot photoshopped images</a>, but this work is specifically targeted at detecting facial manipulation. The company says the work is more pressing than ever. &#8220;We live in a world where it&#8217;s becoming harder to trust the digital information we consume,&#8221; said Adobe researcher Richard Zhang. And when it comes to spotting manipulated images and altered faces, Adobe says this is just the beginning.</p>
</div>
<p><a href="https://www.engadget.com/2019/06/14/adobe-ai-manipulated-images-faces-photoshop/">Source link</a></p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
