<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.9.5">Jekyll</generator><link href="https://blog.lockdownprivacy.com/feed.xml" rel="self" type="application/atom+xml" /><link href="https://blog.lockdownprivacy.com/" rel="alternate" type="text/html" /><updated>2024-06-14T15:00:24+00:00</updated><id>https://blog.lockdownprivacy.com/feed.xml</id><title type="html">Transparency Matters</title><subtitle>Nuanced takes on transparency, privacy, and incentives.</subtitle><author><name>Lockdown Privacy</name></author><entry><title type="html">Lockdown 2.0: How We’re Getting There</title><link href="https://blog.lockdownprivacy.com/2023/02/23/Lockdown-2-How-Were-Getting-There.html" rel="alternate" type="text/html" title="Lockdown 2.0: How We’re Getting There" /><published>2023-02-23T00:00:00+00:00</published><updated>2023-02-23T00:00:00+00:00</updated><id>https://blog.lockdownprivacy.com/2023/02/23/Lockdown-2-How-Were-Getting-There</id><content type="html" xml:base="https://blog.lockdownprivacy.com/2023/02/23/Lockdown-2-How-Were-Getting-There.html"><![CDATA[<h4 id="scaling-privacy-to-the-next-10-million-users">Scaling Privacy to the Next 10 Million Users</h4>
<!--more-->

<p>We’re pleased to announce that Lockdown Privacy has been acquired by Appex Group, a privacy-conscious mobile development group fully owned and operated by passionate technologists and engineers. Appex is headquartered in Boston, Massachusetts, and is led by open source veterans with extensive backgrounds in security. The Appex team was immediately attracted to Lockdown’s industry-leading transparency, and saw that by prioritizing trust through transparency, Lockdown was able to grow organically to over a million users, instead of relying on invasive ads or marketing - a win both for user privacy and for Lockdown.</p>

<p>Going forward, not only is the Appex team committed to operating Lockdown as open source, but they’re also heavily investing in building out new privacy features and performance enhancements, such as a next-generation tracker-blocking engine that’s dramatically more efficient - updates that our two-person team didn’t previously have the resources for.</p>

<p>For us as founders, we’re happy that Lockdown’s new operators are folks who believe in the importance of privacy, and who arguably have much deeper experience with open source than we do. Of course, actions speak louder than words, so this month, Appex is completing Lockdown’s fifth audit by independent security and privacy experts - the update will be posted and announced in the next few weeks on OpenAudit.</p>

<p>We look forward to continuing to work with Appex as strategic advisors - and if you have any questions, concerns, or feedback, you can reach the Appex team at <a href="mailto:team@lockdownprivacy.com">team@lockdownprivacy.com</a>.</p>

<p>Best,</p>

<p>Appex &amp; Lockdown Privacy</p>]]></content><author><name>Lockdown Privacy</name></author><summary type="html"><![CDATA[Scaling Privacy to the Next 10 Million Users]]></summary></entry><entry><title type="html">Study: Effectiveness of Apple’s App Tracking Transparency</title><link href="https://blog.lockdownprivacy.com/2021/09/22/study-effectiveness-of-apples-app-tracking-transparency.html" rel="alternate" type="text/html" title="Study: Effectiveness of Apple’s App Tracking Transparency" /><published>2021-09-22T00:00:00+00:00</published><updated>2021-09-22T00:00:00+00:00</updated><id>https://blog.lockdownprivacy.com/2021/09/22/study-effectiveness-of-apples-app-tracking-transparency</id><content type="html" xml:base="https://blog.lockdownprivacy.com/2021/09/22/study-effectiveness-of-apples-app-tracking-transparency.html"><![CDATA[<h4 id="does-it-stop-third-party-tracking-or-is-it-just-an-illusion-of-privacy">Does it stop third-party tracking? Or is it just an illusion of privacy?</h4>
<!--more-->

<p><em>This study, conducted by Lockdown Privacy, was featured and summarized by The Washington Post in <a href="https://www.washingtonpost.com/technology/2021/09/23/iphone-tracking/">September 2021</a>.</em></p>
<h3 id="summary">Summary</h3>
<p>In April 2021, Apple released the App Tracking Transparency (“ATT”) feature with iOS 14.5. ATT claims to give users choice and transparency for third-party tracking in their apps, and it was lauded by many as a step forward in protecting user privacy. Does it really work? Five months after its release, we tested ten of the top apps in the App Store to see if ATT succeeds in stopping tracking.</p>

<p>Using the open source <a href="https://lockdownprivacy.com">Lockdown Privacy</a> app and manual testing, we found that <strong>App Tracking Transparency made no difference in the total number of active third-party trackers, and had a minimal impact on the total number of third-party tracking connection attempts. We further confirmed that detailed personal or device data was being sent to trackers in almost all cases.</strong> ATT was functionally useless at stopping third-party tracking, even when users explicitly chose “Ask App Not To Track”.</p>

<h3 id="who-we-are">Who We Are</h3>
<p>We’re ex-Apple engineers whose mission is to increase transparency in technology. We’ve spent the last four years creating open source privacy software (<a href="https://lockdownprivacy.com">Lockdown Privacy</a>), writing informative content (<a href="https://blog.lockdownprivacy.com">Transparency Matters</a>, <a href="https://privacyreview.co">Privacy Review</a>, <a href="https://openlyoperated.org">Openly Operated</a>), and building free tools to help developers become more transparent (<a href="https://openaudit.com/tutorial">OpenAudit</a>).</p>

<p>Unlike most privacy companies that ask users to trust their unproven Privacy Policies, everything we build is open source, so all the results and data in this report can be replicated and verified by anyone with an iOS device. And because we’re also Openly Operated, all our claims are fully backed up by source code, infrastructure, and audit trails — verifiable by anyone with an internet connection.</p>
<h3 id="background--objective">Background + Objective</h3>
<p>It’s no secret that Apple tries to brand itself as a privacy-conscious counterpart to its rivals Google and Facebook, whose ad-based businesses are inherently at odds with privacy. This privacy-as-marketing approach sounds great in ads, but in practice, it’s often <a href="https://twitter.com/lockdown_hq/status/1405298060171923462">less-than-honest</a> or can even <a href="https://www.fastcompany.com/90591586/apple-privacy-nutrition-labels-flaws">be counterproductive</a>. In this study, we test the effectiveness of Apple’s latest privacy feature: App Tracking Transparency.</p>

<p>App Tracking Transparency claims to give users the choice to allow, or to ask apps not to perform, third-party tracking. While using an app, the average iOS user will encounter ATT via this dialog:
<img src="https://privacyreview-site-assets.s3.amazonaws.com/images/att-dialog.jpg" alt="The standard ATT dialog with message: &quot;Allow [insert app name here]&quot; to track your activity across other companies' apps and websites?&quot; and two choices &quot;Ask App Not to Track&quot; and &quot;Allow&quot;" /></p>

<p>The user’s choice will then show up in the Settings app, under Privacy &gt; Tracking:
<img src="https://privacyreview-site-assets.s3.amazonaws.com/images/settings-tracking.jpg" alt="Screenshot of the Settings, Privacy, Tracking page, toggle 'Allow Apps to Request to Track' to ON, text &quot;Allow apps to ask to track your activity across other companies' apps and websites. When this is off, all new app tracking requests are automatically denied. Then Doordash shows switch OFF and Facebook shows switch OFF." /></p>

<p>Since the two screens above are the entirety of what most iOS users are going to see about ATT, we’ll use these dialogs as the standard for our study. In the example above, if ATT is working as advertised, this means that for the DoorDash app, “all new app tracking requests are automatically denied” and the user has asked the app to not track their “activity across other companies’ apps and websites.”</p>

<p>The objective of this study is to measure how well ATT lives up to Apple’s own claims, and to answer the question: How effective is App Tracking Transparency at stopping third-party tracking activity?</p>
<h3 id="methodology">Methodology</h3>

<p>To test how effectively App Tracking Transparency works, we selected ten top-ranked apps, most of which were featured by Apple’s own App Store Editors under either the Apps or Games tabs. We used <a href="https://itunes.apple.com/app/apple-store/id1469783711">Lockdown Privacy v1.2.4</a> to detect (and block) third-party trackers, and manual testing to uncover the actual content being sent. Some manual testing later uncovered additional trackers, which were also included in the results.</p>

<p>We enabled the following Block Lists: Amazon Trackers, Data Trackers, Email Trackers, Facebook Trackers, Game Marketing, General Marketing, Google Shopping, Marketing Trackers, Marketing Trackers II, Reporting. The test device was an iPhone XR initially running iOS 14.8. We later re-tested with the final public release of iOS 15 and found that iOS’s ATT behavior, tracking data, and the number of trackers were identical. Tests were conducted during August and September 2021.</p>

<p>Since Lockdown Privacy is free and open source, all data and results in this study can be verified by anyone. Due to app updates and specific usage behaviors, independent verification results may differ slightly — for example, using a test app for a longer period of time will likely result in more tracking activity.</p>

<p><strong>Test Process</strong></p>

<p>We tested each app twice: first with choosing ATT’s “Ask App Not To Track”, and then again with choosing ATT’s “Allow [Tracking]”. In each test, we did a clean signup and basic usage of the app for no more than two minutes. After each test, we recorded what tracking activity was detected in Lockdown Privacy’s Block Log, and then reset everything to test the next case.</p>

<p><strong>Simplified Test Example</strong></p>

<p>The following is a shortened version of the test process for the Yelp app. It first shows the Settings panel, confirming that Tracking is toggled off for Yelp, then the launch of the Yelp app. Finally, in Lockdown Privacy’s Block Log, we document the dozens of tracking attempts by Yelp’s third-party trackers that iOS’s ATT allowed through.</p>

<div style="width: 100%; text-align: center; margin-bottom: 20px;">
	<video height="450" controls="" autoplay="" muted="" style="display: block; margin-left: auto; margin-right: auto;">
		<source src="https://privacyreview-site-assets.s3.amazonaws.com/images/yelp.mp4" type="video/mp4" />
		Your browser does not support the video tag.
	</video>
</div>
<h3 id="results--data">Results + Data</h3>
<h4 id="tracking-activity-in-top-ranked-apps-att-on-vs-off">Tracking Activity in Top Ranked Apps: ATT On Vs Off</h4>
<table style="text-align:center;">
	<tr style="font-weight:600;">
		<td>App</td><td colspan="2">Active Trackers Found</td><td colspan="2">Tracking Attempts<br />(Signup &amp; Basic Usage)</td>
	</tr>
	<tr style="font-weight: 600;">
		<td></td>
		<td style="padding: 4px 0px"><small>Ask App Not To Track</small></td>
		<td style="padding: 4px 0px"><small>Allow Tracking</small></td>
		<td style="padding: 4px 0px"><small>Ask App Not To Track</small></td>
		<td style="padding: 4px 0px"><small>Allow Tracking</small></td>
	</tr>
	<tr>
		<td style="padding: 0px 4px;"><strong>Yelp</strong><br /><small>v12.19.0</small><br /><small><strong>#4 Food &amp; Drink</strong></small></td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Facebook Graph</li>
				<li>Bugsnag</li>
				<li>Comscore</li>
				<li>Branch</li>
				<li>Appboy</li>
				<li>Adjust</li>
			</ul>
		</td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Facebook Graph</li>
				<li>Bugsnag</li>
				<li>Comscore</li>
				<li>Branch</li>
				<li>Appboy</li>
				<li>Adjust</li>
			</ul>
		</td>
		<td>39</td>
		<td>42</td>
	</tr>
	<tr>
		<td style="padding: 8px 4px;"><strong>Telegram</strong><br /><small>v7.8.4</small><br /><small><strong>#5 Social Networking</strong></small></td>
		<td colspan="2" style="margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
			No Trackers Found
		</td>
		<td colspan="2">0</td>
	</tr>
	<tr>
		<td style="padding: 0px 4px;"><strong>Grubhub</strong><br /><small>v2021.28</small><br /><small><strong>#7 Food &amp; Drink</strong></small></td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Facebook Graph</li>
				<li>New Relic</li>
				<li>Branch</li>
				<li>Google Analytics</li>
				<li>Braze</li>
				<li>AppsFlyer</li>
			</ul>
		</td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Facebook Graph</li>
				<li>New Relic</li>
				<li>Branch</li>
				<li>Google Analytics</li>
				<li>Braze</li>
				<li>AppsFlyer</li>
			</ul>
		</td>
		<td>222</td>
		<td>302</td>
	</tr>
	<tr>
		<td style="padding: 0px 4px;"><strong>Run Rich 3D</strong><br /><small>v1.5.4</small><br /><small><strong>#1 Simulation</strong></small></td>
		<td style="text-align:left;">
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Vungle</li>
				<li>Google Analytics</li>
				<li>Mopub</li>
				<li>Applovin</li>
				<li>Google Ads</li>
				<li>Chartboost</li>
				<li>Adjust</li>
			</ul>
		</td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Vungle</li>
				<li>Google Analytics</li>
				<li>Mopub</li>
				<li>Applovin</li>
				<li>Google Ads</li>
				<li>Chartboost</li>
				<li>Adjust</li>
			</ul>
		</td>
		<td>53</td>
		<td>53</td>
	</tr>
	<tr>
		<td style="padding: 0px 4px;"><strong>Starbucks</strong><br /><small>v6.10</small><br /><small><strong>#5 Food &amp; Drink</strong></small></td>
		<td colspan="2" style="text-align:left;">
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Branch</li>
				<li>New Relic</li>
				<li>Google Analytics</li>
			</ul>
			<small><i>App Did Not Ask For Tracking Permission</i></small>
		</td>
		<td colspan="2">
			21
		</td>
	</tr>
	<tr>
		<td style="padding: 0px 4px;"><strong>Streamer Life!</strong><br /><small>v1.1</small><br /><small><strong>#1 RPG</strong></small></td>
		<td style="text-align:left;">
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Mobvista</li>
				<li>Applovin</li>
				<li>Vungle</li>
				<li>Ironsource</li>
				<li>Mopub</li>
				<li>Facebook Graph</li>
				<li>Google Ads</li>
				<li>Chartboost</li>
				<li>Adjust</li>
			</ul>
		</td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Mobvista</li>
				<li>Applovin</li>
				<li>Vungle</li>
				<li>Ironsource</li>
				<li>Mopub</li>
				<li>Facebook Graph</li>
				<li>Google Ads</li>
				<li>Chartboost</li>
				<li>Adjust</li>
			</ul>
		</td>
		<td>169</td>
		<td>144</td>
	</tr>
	<tr>
		<td style="padding: 8px 4px;"><strong>Subway Surfers</strong><br />
			<small>v2.20.3</small><br />
			<small><strong>#5 Action</strong></small>
		</td>
		<td style="text-align:left;">
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Facebook Graph</li>
				<li>Ironsource</li>
				<li>Google Firebase Analytics</li>
				<li>Vungle</li>
				<li>AppLovin</li>
				<li>Chartboost</li>
			</ul>
		</td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Facebook Graph</li>
				<li>Ironsource</li>
				<li>Google Firebase Analytics</li>
				<li>Vungle</li>
				<li>AppLovin</li>
				<li>Chartboost</li>
			</ul>
		</td>
		<td>42</td>
		<td>41</td>
	</tr>
	<tr>
		<td style="padding: 0px;"><strong>Cash App</strong><br />
			<small>v3.44</small><br />
			<small><strong>#1 Finance</strong></small>
		</td>
		<td colspan="2" style="text-align:left;">
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Google Analytics</li>
				<li>Bugsnag</li>
				<li>AppsFlyer</li>
			</ul>
			<small><i>App Did Not Ask For Tracking Permission</i></small>
		</td>
		<td colspan="2">25</td>
	</tr>
	<tr>
		<td style="padding: 8px 4px;"><strong>DoorDash</strong><br />
			<small>v4.46.1</small><br />
			<small><strong>#1 Food &amp; Drink</strong></small>
		</td>
		<td style="text-align:left;">
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Google Firebase Analytics</li>
				<li>New Relic</li>
				<li>Segment</li>
				<li>Facebook Graph</li>
				<li>Adjust</li>
				<li>Amplitude</li>
			</ul>
		</td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Google Firebase Analytics</li>
				<li>New Relic</li>
				<li>Segment</li>
				<li>Facebook Graph</li>
				<li>Adjust</li>
				<li>Amplitude</li>
			</ul>
		</td>
		<td>170</td>
		<td>168</td>
	</tr>
	<tr>
		<td style="padding: 0px;"><strong>Peacock TV</strong><br /><small>v2.7.10</small><br />
			<small><strong>#3 Entertainment</strong></small>
		</td>
		<td style="text-align:left;">
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>New Relic</li>
				<li>Comscore</li>
				<li>Kochava</li>
				<li>Google Analytics</li>
			</ul>
		</td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>New Relic</li>
				<li>Comscore</li>
				<li>Kochava</li>
				<li>Google Analytics</li>
			</ul>
		</td>
		<td>15</td>
		<td>57</td>
	</tr>
</table>

<h4 id="total-tracking-activity-att-on-vs-off">Total Tracking Activity: ATT On Vs Off</h4>

<table style="text-align:center;">
	<tr style="font-weight:600;">
		<td></td>
		<td>Total Active Trackers Found</td>
		<td>Total Tracking Attempts</td>
	</tr>
	<tr>
		<td style="padding: 0px 8px;font-weight:600;">Ask App Not To Track</td>
		<td>50</td>
		<td>756</td>
	</tr>
	<tr>
		<td style="padding: 0px 8px;font-weight:600;">Allow Tracking</td>
		<td>50</td>
		<td>853</td>
	</tr>
	<tr>
		<td>Difference</td>
		<td style="color: red">No Difference In Total Active Trackers<br />When Asking App Not To Track</td>
		<td>97 Fewer Attempts (~13%)<br />When Asking App Not To Track</td>
	</tr>
	</table>
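<p>As a sanity check, the totals above can be recomputed from the per-app results table. Below is a minimal Python sketch (counts transcribed from our table; the two apps that never showed the ATT dialog contribute their single measurement to both columns):</p>

```python
# Tracking attempts per app, transcribed from the results table, as
# (ask_not_to_track, allow_tracking) pairs. Starbucks and Cash App
# never asked for tracking permission, so one measurement fills both.
attempts = {
    "Yelp": (39, 42),
    "Telegram": (0, 0),
    "Grubhub": (222, 302),
    "Run Rich 3D": (53, 53),
    "Starbucks": (21, 21),
    "Streamer Life!": (169, 144),
    "Subway Surfers": (42, 41),
    "Cash App": (25, 25),
    "DoorDash": (170, 168),
    "Peacock TV": (15, 57),
}

ask_total = sum(a for a, _ in attempts.values())    # total with "Ask App Not To Track"
allow_total = sum(b for _, b in attempts.values())  # total with "Allow Tracking"
diff = allow_total - ask_total                      # attempts avoided by opting out
pct_more = 100 * diff / ask_total                   # "Allow" saw ~13% more attempts

print(ask_total, allow_total, diff, round(pct_more, 1))
```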

<h4 id="what-data-is-being-sent-to-trackers">What Data Is Being Sent To Trackers?</h4>
<p>There were a significant number of connections to third parties. Blocking these connections did not affect the functionality of the apps being used, so we looked into what data was being sent. In this section, we highlight the most notable examples. In all cases below, the test chose “Ask App Not To Track” in the ATT dialog.</p>

<p>Note that in all cases, the user’s IP address is exposed to the third party, because that is a basic requirement of making a connection to any site or server on the internet. Particularly personal or egregious data is marked <span style="color: red;">in red</span>.</p>

<table style="text-align:center;">
	<tr style="font-weight:600;">
		<td>App</td><td>Tracker</td><td>Data Sent To Tracker</td>
	</tr>
	<tr>
		<td><strong>Grubhub</strong></td>
		<td>
			<strong><a href="https://braze.com">Braze</a></strong>
			<br />
			<i>Marketing</i>
		</td>
		<td>
			<ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li style="color: red;">First Name</li>
				<li style="color: red;">Last Name</li>
				<li style="color: red;">Location (Exact Long/Lat)</li>
				<li>Cellular Carrier Name (E.g, "AT&amp;T")</li>
				<li>Screen Resolution</li>
				<li>User Locale</li>
				<li>Time Zone</li>
				<li>iPhone Model</li>
				<li>iOS Version</li>
				<li>Behavioral (Taps, Page Views)</li>
			</ul>
		</td>
	</tr>
  <tr>
    <td><strong>Subway Surfers</strong></td>
    <td>
      <strong><a href="https://vungle.com">Vungle</a></strong>
      <br />
      <i>Ad Network</i>
    </td>
    <td>
      <ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li>Cellular Carrier Name (E.g, "AT&amp;T")</li> 
				<li style="color: red;">Free Storage Space (bytes precision)</li>
				<li style="color: red;">Current Battery Level (15 decimals precision)</li>
				<li style="color: red;">Current Volume Level (3 decimals precision)</li>
				<li>Time Zone</li>
				<li>User Locale</li>
				<li style="color: red;">Battery Charging State (E.g, "Plugged In")</li>
				<li>Connection Type (E.g, "Cellular")</li>
				<li>Connection Type Detail (e.g, "LTE")</li>
				<li>iPhone Model (E.g, "iPhone X")</li>
				<li>iOS Version</li>
				<li>Language</li>
				<li>User Agent (Browser Agent)</li>
				<li>Screen Resolution</li>
      </ul>
    </td>
  </tr>
  <tr>
    <td><strong>Run Rich 3D</strong></td>
    <td>
      <strong><a href="https://chartboost.com">Chartboost</a></strong>
      <br />
      <i>Ad Network</i>
    </td>
    <td>
      <ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li style="color: red;">Device Name (e.g, "John's iPhone X")</li> 
				<li style="color: red;">Accessibility Setting: Bold Text</li> 
				<li style="color: red;">Accessibility Setting: Custom Text Size</li> 
				<li>Display Setting: Dark Mode</li>
				<li>Screen Resolution</li>
				<li>Time Zone</li>
				<li>Total Storage Space (bytes precision)</li>
				<li style="color: red;">Free Storage Space (bytes precision)</li>
				<li>Currency (e.g, "USD")</li>
				<li>iOS Version</li>
				<li style="color: red;">Audio Output (e.g, "Speakerphone"/"Bluetooth")</li>
				<li style="color: red;">Audio Input (e.g, "iPhone Microphone")</li>
				<li style="color: red;">Accessibility Setting: Closed Captioning</li>
				<li>Country</li>
				<li>Cellular Carrier Name (E.g, "AT&amp;T")</li>
				<li>Cellular Carrier Country</li>
				<li style="font-weight: 600; color: red;">Last Restart Time (Exact Timestamp, Second Precision)</li>
				<li style="color: red;">Calendar Type (E.g, "Gregorian")</li>
				<li style="color: red;">Enabled Keyboards (E.g, "English, Emoji, Arabic")</li>
				<li style="color: red;">Current Battery Level (15 decimals precision)</li>
				<li style="color: red;">Current Volume Level (3 decimals precision)</li>
				<li style="color: red;">Accessibility Setting: Increase Contrast</li>
				<li style="color: red;">Current Screen Brightness (15 decimals precision)</li>
				<li>Portrait/Landscape Mode</li>
				<li style="color: red;">Battery Charging State (E.g, "Plugged In")</li>
				<li>iPhone Model (E.g, "iPhone X")</li>
				<li>Language</li>
				<li>User Agent (Browser Agent)</li>
      </ul>
    </td>
  </tr>
  <tr>
    <td><strong>Cash App, Grubhub</strong></td>
    <td>
      <strong><a href="https://appsflyer.com">AppsFlyer</a></strong>
      <br />
      <i>Marketing</i>
    </td>
    <td>
      <ul style="text-align: left; margin-bottom: 0px; font-size: 0.85em; margin-left: 18px;">
				<li style="color: red;"><strong>All data sent by this tracker was encrypted, and the client SDK is closed source. There is no way to determine how much personal user data is being sent to AppsFlyer.</strong></li>
      </ul>
    </td>
  </tr>
</table>
<h2 id="discussion">Discussion</h2>
<p>In our tests of ten top-ranked apps, we found no meaningful difference in third-party tracking activity when choosing App Tracking Transparency’s “Ask App Not To Track”. The number of active third-party trackers was identical regardless of a user’s ATT choice, and the number of tracking attempts was only slightly (~13%) lower when the user chose “Ask App Not To Track”.</p>

<p>How do these results compare to Apple’s App Tracking Transparency claims? In our earlier section, we observed that the average iOS user will see ATT’s claims that “all new app tracking requests are automatically denied” and that a user can ask the app to not track their “activity across other companies’ apps and websites.”</p>

<p>Neither of Apple’s claims about App Tracking Transparency hold up. New app tracking requests were <em>not</em> automatically denied, and even though it is technically correct that the user can <em>ask</em> apps to not track their activity across other companies’ apps and websites, in our tests, in no case did apps respect this request. And since every connection exposes the user’s IP address, it’s trivial for these third parties to uniquely identify users, rendering ATT useless.</p>

<p>In many cases, trackers received much more than the user’s IP address. We observed that they received an egregious amount of data, including a user’s accessibility settings, the exact time that the user last restarted the device (down to the second), the device’s current battery level and screen brightness (both with a precision of 15 decimals), the user’s exact latitude and longitude, installed keyboards, time zone, current speaker and microphone settings, currency, volume level, and much more — see the “What data is being sent to trackers?” section above for full examples.</p>
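<p>High-precision fields like these are exactly what makes device fingerprinting work: each value narrows the set of matching devices, and readings reported to 15 decimal places or to the exact second are nearly unique on their own. A hypothetical illustration (the field names and values below are our own, modeled on the payloads documented above, not an actual tracker request):</p>

```python
import hashlib
import json

# Hypothetical device report modeled on fields observed in our tests.
report = {
    "model": "iPhone X",
    "ios_version": "14.8",
    "timezone": "America/New_York",
    "battery_level": 0.646999979019165,      # 15-decimal precision
    "screen_brightness": 0.512345678901234,  # 15-decimal precision
    "last_restart": "2021-09-04T08:12:37",   # second-level precision
    "free_storage_bytes": 23456789012,       # byte-level precision
}

# Hashing the serialized report yields a fingerprint. The high-precision
# fields make collisions between two real devices extremely unlikely, so
# this "anonymous" report identifies the device without any ad identifier.
fingerprint = hashlib.sha256(
    json.dumps(report, sort_keys=True).encode()
).hexdigest()

print(fingerprint[:16])
```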
<h3 id="apples-massive-loophole-a-narrow-definition-of-tracking">Apple’s Massive Loophole: A Narrow Definition of Tracking</h3>
<p>How could Apple have failed so miserably at stopping third-party trackers with a feature named “App Tracking Transparency”? Digging into this question led us to discover the main cause: Apple’s narrow definition of the term “tracking”.</p>

<p>In a way, the definition of tracking <em>can</em> be simple and intuitive: Tracking is when an app unnecessarily sends your data to third parties. That third party has now tracked you, because they have, at a minimum, your IP address: the simplest and oldest identifier on the web. From your IP address, your location can be approximated fairly reliably, along with who your ISP is. And since it’s an identifier that stays the same for most home internet connections, it’s a pretty easy way to track you across the web.</p>
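<p>To make that concrete, here is a hypothetical sketch (our own illustration, not code from any real tracker SDK) of how a tracker’s server could derive a stable pseudo-identifier from nothing more than the metadata that arrives with every HTTP request:</p>

```python
import hashlib

def pseudo_id(ip: str, user_agent: str) -> str:
    """Derive a stable pseudo-identifier from connection metadata.

    No cookies, no IDFA, no permission prompt required: the IP address
    and User-Agent arrive with every request. A typical home IP stays
    the same for weeks, so the hash stays the same across every app
    and site that shares traffic with the same tracker.
    """
    return hashlib.sha256(f"{ip}|{user_agent}".encode()).hexdigest()[:16]

# Two "different" apps reporting the same device yield the same ID:
a = pseudo_id("203.0.113.7", "Mozilla/5.0 (iPhone; CPU iPhone OS 14_8 like Mac OS X)")
b = pseudo_id("203.0.113.7", "Mozilla/5.0 (iPhone; CPU iPhone OS 14_8 like Mac OS X)")
assert a == b
```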

<p>Instead, Apple has hijacked the term “tracking” to mean something highly specific, and they’ve placed their full definition in developer documentation, which of course no average iOS user will ever read. So what is it? It turns out that Apple’s interpretation of “tracking” requires that a behavior fulfill <em>all</em> of these conditions: First, it must link user data from one app/website to another app/website. Second, it must do this specifically for targeted advertising or advertising measurement purposes. Third, it must not fall under Apple’s list of so-called acceptable behaviors that are excluded from “tracking”.</p>

<p>Based on our research, we found Apple’s definition of tracking to be misleading, counterintuitive, and confusing for these reasons:</p>

<p>1) <strong>It is too narrow in scope</strong>: There are obvious cases outside of advertising where a reasonable person would consider a behavior “tracking”. For example, <em>if someone was tracking your location in real-time using spyware embedded in an app, Apple would not consider this “tracking”, because it is not “advertising” related.</em></p>

<p>2) <strong>It contains too many caveats</strong>: Buried in developer documentation, Apple maintains a list of tracking behaviors that are not considered “tracking”. While these exceptions might <em>sound</em> reasonable, most of them are vague enough that they can be reinterpreted by even the most egregious trackers to fit in these exclusions. For example, since “fraud detection” is such an expansive exception, marketing trackers can (and do) claim that they are sending dozens of different signals and data from a user’s device, not to uniquely identify the user for advertising, but for the purposes of so-called “fraud detection”. In fact, during our research, we found a case where a third party’s software (Kochava) labeled itself “AppleTracker”, even though Apple’s definition would not consider it “tracking”.</p>

<p>3) <strong>It relies too heavily on trusting the very tracking companies that the policies are supposed to be protecting users against</strong>: Apple’s definition allows apps to secretly send any and all of your data to third parties, and as long as those third parties <em>publicly claim</em> they won’t link your data to other sites or sell it, it’s not considered “tracking” by Apple. It is a 100% trust-based honor system, which means that the only way for these companies to get caught “tracking” is to literally pen a public confession of guilt or wrongdoing — something that profit-driven companies are not exactly known for doing.</p>

<p>4) <strong>It incentivizes less transparency, creating more dangers for privacy</strong>: Apple’s tracking definition makes the privacy-regressive assumption that third-party connections are by default innocuous, so any enforcement would require that someone provides definitive proof of abuse. But what if these third-party connections are intentionally obscured or encrypted? Our testing found that Apple allows the closed-source AppsFlyer tracker to encrypt the data that it sends from the user’s device, so that nobody can see exactly what AppsFlyer is receiving. The danger here is that a third-party tracker can access anything its host app can access. For example, for Square’s Cash App, as unlikely as it may sound, it’s entirely possible that its embedded AppsFlyer tracker is sending all of your financial transactions to AppsFlyer’s servers, and nobody other than AppsFlyer — <em>not even Apple</em> — would know.</p>

<h3 id="to-catch-a-tracker-we-must-think-like-a-tracker">To Catch A Tracker, We Must Think Like A Tracker</h3>
<p>But what if Apple is right to trust third-party tracking companies, and all the tracking companies <em>really are</em> trying their best to protect user privacy, even at the expense of their own profits? To test this theory, we “went undercover” by spending a few hours as an app developer who wanted to circumvent Apple’s own tracking rules when users choose ATT’s “Ask App Not To Track”. Would the tracking services allow us to do this, or would we be proven wrong?</p>

<p>It turns out, we weren’t wrong. We were actually more right than we thought.</p>

<p>Not only do these trackers <em>allow</em> their clients to break Apple’s rules, but they <strong>specifically built features to help their clients easily circumvent Apple’s ATT privacy rules</strong>.</p>

<p>First, we created a dummy app that used the Kochava tracking service. With just a few clicks, we configured Kochava to violate Apple’s “ATT Opt-Out” by asking it to track users across apps (using “IP address” and “User Agent”) for the purpose of ad targeting (“Paid Media”). Basically, Kochava made it really convenient for any app developer to violate even Apple’s narrow definition of tracking.
<img src="https://privacyreview-site-assets.s3.amazonaws.com/images/Kochava.jpg" alt="Screenshot of our dummy app's settings in Kochava, showing that we configured it to explicitly violate Apple's ATT rules." /></p>

<p>We later performed the same test with the AppsFlyer tracking service (which, as previously mentioned, hides the data it sends off your device), and it was <em>even easier</em> to enable “privacy cheat mode” and track users against their consent — all it took was clicking a single button.</p>

<p><img src="https://privacyreview-site-assets.s3.amazonaws.com/images/IMG_3093.png" alt="Screenshot of equivalent situation in AppsFlyer, showing it's only one button to turn off user privacy." /></p>

<p>It’s important to note that since these “cheat ATT” settings are configured on the third-party tracker’s dashboard, there is a zero percent chance of Apple finding out that apps are engaging in this behavior, because Apple has no access to those dashboards. <em>Achievement unlocked: Illusion of User Privacy.</em></p>
<h2 id="conclusion">Conclusion</h2>
<p>When it comes to stopping third-party trackers, App Tracking Transparency is a dud. Worse, giving users the option to tap an “Ask App Not To Track” button may even give users a false sense of privacy: users who would have otherwise been more cautious with giving their data to an app might let their guard down, thinking that they’re “safe” from third-party tracking.  Furthermore, we found that some apps didn’t even bother to show the ATT dialog, despite contacting numerous third-party trackers.</p>

<p>The core problem is that App Tracking Transparency is entirely based on the honor system, so it suffers the <a href="https://www.fastcompany.com/90591586/apple-privacy-nutrition-labels-flaws">same fatal flaw as Apple’s “Privacy Nutrition Facts”</a>. App developers can choose whether or not to be honest about tracking, and if all their competitors are lying, why would they choose to be honest? Since the App Store has millions of apps, slipping by the rules is not only easy, but as our testing showed, it’s the norm.</p>

<p>Apple also doesn’t have an incentive to police the very profitable App Store (~20% of revenue). Sure, they’ll shut down <a href="/2020/11/25/how-to-make-80000.html">no-name scams when they go viral</a>, or apps that <a href="https://en.wikipedia.org/wiki/Epic_Games_v._Apple">publicly tout rule violations</a>. But what are the odds that Apple reprimands Yelp (partnership with Apple Maps), or DoorDash (#1 Food Delivery App), or popular games (<a href="https://twitter.com/rjonesy/status/1436372845458771969">98% of App Store revenue</a>)? And with enough users, apps can even gain leverage over the “walled garden”, since Apple can’t afford to push those users toward Android or Huawei. For example, WeChat was <a href="https://reclaimthenet.org/apple-app-store-wechat-china/">allowed to violate</a> rules because China is a critical market for Apple, and <a href="https://www.techspot.com/news/71289-apple-granted-uber-ios-app-entitlement-allowed-record.html">Uber was given a unique privilege</a> that let it record users’ screens.</p>

<p>So what should Apple do?</p>

<p>Trackers can’t send data that they don’t have access to, so the most direct technical fix is to limit or eliminate app access to information that is used to <a href="https://ssd.eff.org/en/module/what-fingerprinting">fingerprint devices</a>. In iOS 15, apps are allowed unlimited access to device data that is totally irrelevant to their functionality. For example, why would any app need access to the <a href="https://developer.apple.com/forums/thread/101874?answerId=309633022#309633022">exact</a> <a href="https://web.archive.org/web/20210118025402/https://developer.apple.com/forums/thread/101874?answerId=309633022#309633022">second</a> that the user last restarted their iPhone? And why does iOS give apps access to this data at such a high degree of precision? Data such as an iPhone’s <a href="https://developer.apple.com/documentation/uikit/uidevice/1620042-batterylevel">remaining battery</a> and <a href="https://developer.apple.com/documentation/uikit/uiscreen/1617830-brightness">screen brightness</a> (both accurate to <a href="https://developer.apple.com/documentation/coregraphics/cgfloat">15 decimals</a>), or an iPad’s <a href="https://developer.apple.com/documentation/foundation/nsfilesystemfreesize">remaining free space</a> (down to the byte), serve no legitimate, <a href="https://nshipster.com/device-identifiers/">non-fingerprinting purpose</a> for most apps. And for the rare app that might need this level of precision, App Review should approve the usage on a case-by-case basis.</p>
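<p>To make the fingerprinting risk concrete, here is a minimal sketch of how a tracker could combine such high-precision values into a stable device identifier. The field names and readings below are purely illustrative (made-up values, not real iOS API output); the point is that the more precise each reading, the more likely the resulting hash is unique to one device.</p>

```python
import hashlib

# Hypothetical readings, at the level of precision iOS exposes to any app.
# (Values are illustrative, not pulled from a real device.)
readings = {
    "boot_time": "2021-09-14T08:03:27",      # exact second of last restart
    "battery_level": "0.7349999999999999",   # far more precision than a battery meter needs
    "screen_brightness": "0.5625000000000000",
    "free_disk_bytes": "23456789012",        # free space down to the byte
}

# A fingerprinter simply concatenates the readings and hashes them,
# producing a pseudo-identifier that requires no ATT prompt or consent.
fingerprint = hashlib.sha256(
    "|".join(f"{k}={v}" for k, v in sorted(readings.items())).encode()
).hexdigest()

print(fingerprint[:16])
```

<p>Because every reading comes from a permissionless API, two visits from the same device hash to the same value, which is all a cross-app tracker needs.</p>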

<p>Apple also needs to take a hard line against closed-source trackers — especially the ones that further encrypt the data they’re sending to third-party servers. A tracker has access to everything its host app has access to: trackers in your finance app can read all your financial data, trackers in health apps can read all your health data, and trackers in your photos apps can read all your photos. And the only way to ensure a tracker isn’t stealing any of this is for the tracker to reveal its source code. Without that, the “walled garden” concept is a total sham, because even Apple’s App Review team has zero visibility into what data is being sent to third parties.</p>

<p>On the marketing and user interface side, Apple needs to come clean that App Tracking Transparency is a completely trust-based system, and possibly even rename the feature itself. It’s plain dishonest to show users a dialog that gives the option of choosing “Ask App Not To Track”, while Apple (a company that excels in user interface and design) knows full well that 1) the average user will be misled into thinking they’re protected from tracking and that 2) in practice, there is very little to no compliance or effect on third-party tracking.</p>

<p>In the Settings app, Apple needs to be extremely clear that iOS currently does not and cannot stop third-party tracking. Before iOS 14.5, every app permission (Camera, Contacts, etc) in the Privacy panel has always been enforced by iOS, ensuring that certain apps can or can’t access certain features. iOS 14.5’s Tracking permission breaks this ten-year-old iOS pattern and misleads users into thinking that it’s enforced like every other permission. In fact, iOS even claims something completely untrue here: that “new app tracking requests are automatically denied.”</p>

<p>Finally, over the long term, Apple should be more transparent and model transparency for developers by open sourcing not only their software, but also their processes. What <em>exactly</em> does Apple or any app developer do with a user’s data when it’s uploaded to the cloud? What trackers are embedded in an app? How does Apple’s App Review verify an app’s privacy promises?</p>

<p>In a world where software and processes are truly transparent, these and other pressing privacy questions can be answered accurately and verified easily. Absent that, a billion iPhone users around the world are stuck with an unacceptable compromise: relying on independent researchers to, in their spare time, test and report on the effectiveness of a trillion dollar company’s flagship features.</p>

<p><br />
<br /></p>]]></content><author><name>Lockdown Privacy</name></author><summary type="html"><![CDATA[Does it stop third-party tracking? Or is it just an illusion of privacy?]]></summary></entry><entry><title type="html">Introducing OpenAudit: Forget Privacy Policies, Get Privacy Proof</title><link href="https://blog.lockdownprivacy.com/2021/05/26/april-2021-openaudit-of-lockdown-privacy.html" rel="alternate" type="text/html" title="Introducing OpenAudit: Forget Privacy Policies, Get Privacy Proof" /><published>2021-05-26T00:00:00+00:00</published><updated>2021-05-26T00:00:00+00:00</updated><id>https://blog.lockdownprivacy.com/2021/05/26/april-2021-openaudit-of-lockdown-privacy</id><content type="html" xml:base="https://blog.lockdownprivacy.com/2021/05/26/april-2021-openaudit-of-lockdown-privacy.html"><![CDATA[<h4 id="to-see-exactly-what-your-apps-are-doing-with-your-data-ask-for-an-openaudit">To see exactly what your apps are doing with your data, ask for an OpenAudit.</h4>
<!--more-->
<div style="margin: 10px auto; text-align: center;">
	<img src="/assets/images/oa-logo.png" alt="Logo for OpenAudit. Yellow circle with the word OpenAudit to the right of it." style="height: 80px;" />
</div>

<p>Apps have a responsibility to protect the privacy of user data, and to secure it against external and internal threats. But apps often just make up whatever privacy claims sound good, and place them in their marketing materials, Privacy Policy, and Apple’s self-reported <a href="/2020/12/18/Apples-Privacy-Nutrition-Facts.html">Privacy Nutrition Facts</a>. This leads to data <a href="/2020/12/02/why-you-cant-trust.html">hacks, leaks, and even theft</a>.</p>

<p>OpenAudit is a standardized way of <em>proving</em> these claims, instead of just asserting them. Here is a <a href="https://openaudit.com/tutorial">simple tutorial</a> on how it works. A claim must have <strong>references</strong> (either specific lines of code, or relevant documentation). Auditors then perform <strong>verifications</strong> on each reference to ensure they adequately support the claim. More relevant to Lockdown Privacy users, we also conducted an <a href="https://openaudit.com">OpenAudit</a> in April 2021. Here’s a snippet:</p>

<p><img src="/assets/images/oa-2-email.png" alt="Screenshot of the same text document, but now there is a popover that has 3 citations/proof entries right beneath the text that was previously pointed to. The first proof is a Github code snippet with actual source code, second is the Wikipedia entry on Advanced Encryption Standard, and third is another code snippet from Github. Under each entry are two &quot;VERIFIED&quot; labels with the usernames of the security auditors who verified each entry." /></p>

<p>In this example, the reader clicked the claim that “user data […] is protected by modern encryption”, which shows a popup with the proof of that claim: three references that support it, and two auditor verifications per reference. OpenAudit is designed to show everyday users which claims have been independently verified, while allowing technical users to quickly dig into the details. Lockdown Privacy’s OpenAudit has a total of 582 references and 1164 verifications, all publicly viewable at <a href="https://openaudit.com">https://openaudit.com</a>.</p>
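<p>The claim → references → verifications structure described above can be sketched as a small data model. This is an illustrative Python sketch, not OpenAudit’s actual schema; the class names, the example reference strings, and the two-verifications-per-reference threshold are assumptions drawn from the description in this post.</p>

```python
from dataclasses import dataclass, field

@dataclass
class Reference:
    source: str  # a specific line of code or a piece of documentation
    verified_by: list = field(default_factory=list)  # auditor usernames

@dataclass
class Claim:
    text: str
    references: list = field(default_factory=list)

    def is_verified(self, min_verifications=2):
        # A claim holds only if it has references, and every reference
        # carries enough independent auditor sign-offs.
        return bool(self.references) and all(
            len(r.verified_by) >= min_verifications for r in self.references
        )

# Hypothetical example mirroring the screenshot above:
claim = Claim("user data [...] is protected by modern encryption", [
    Reference("github.com (code snippet)", ["auditor_a", "auditor_b"]),
    Reference("Wikipedia: Advanced Encryption Standard", ["auditor_a", "auditor_b"]),
])
print(claim.is_verified())  # True
```

<p>The key design point is that a claim is never verified directly — only its references are, so an unsupported claim can never show as verified no matter how it is worded.</p>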

<p>Today, users are forced to blindly trust that their apps (even privacy apps) won’t steal or leak their data. But how can you tell which apps actually respect your privacy, and which apps are just using slick marketing and making false promises? In an App Store plagued with fraudulent, scammy, and negligent apps, OpenAudit lets honest apps stand out by earning user trust through independently verified proof.</p>

<p>OpenAudit is <a href="https://github.com/OpenlyOperated">open source</a> and developers can use it to audit their own apps for free at <a href="https://openaudit.com">openaudit.com</a>. And if you want to work on this full-time, we’re hiring! Reach out to <a href="mailto:work@openaudit.com">work@openaudit.com</a> to join our mission of using transparency to make apps, services, and the web a safer and more trustworthy place for everyone.</p>]]></content><author><name>Lockdown Privacy</name></author><summary type="html"><![CDATA[To see exactly what your apps are doing with your data, ask for an OpenAudit.]]></summary></entry><entry><title type="html">Apple’s “Privacy Nutrition Labels” Have A Fatal Flaw</title><link href="https://blog.lockdownprivacy.com/2020/12/18/Apples-Privacy-Nutrition-Facts.html" rel="alternate" type="text/html" title="Apple’s “Privacy Nutrition Labels” Have A Fatal Flaw" /><published>2020-12-18T00:00:00+00:00</published><updated>2020-12-18T00:00:00+00:00</updated><id>https://blog.lockdownprivacy.com/2020/12/18/Apples-Privacy-Nutrition-Facts</id><content type="html" xml:base="https://blog.lockdownprivacy.com/2020/12/18/Apples-Privacy-Nutrition-Facts.html"><![CDATA[<h4 id="app-privacy-should-give-users-verified-information-not-a-false-sense-of-security">“App Privacy” should give users verified information, not a false sense of security</h4>
<!--more-->

<p><em>This story was featured and re-posted by FastCompany. You can <a href="https://www.fastcompany.com/90591586/apple-privacy-nutrition-labels-flaws">read it here</a>.</em></p>

<p>With the recent release of iOS 14, Apple enabled a new feature called “App Privacy” (or what they call <a href="https://www.seattletimes.com/business/technology/new-from-apple-at-wwdc-hand-washing-alerts-iphone-widgets-and-privacy-nutrition-labels/">Privacy Nutrition Labels</a>) in the App Store, which supposedly shows users what information apps collect, and how it’s used. For example, the Facebook app’s extremely long App Privacy section, which details all the information they collect, is already the subject of viral tweets such as this one:</p>

<p><img src="/assets/images/macrumortweet.jpg" alt="MacRumors tweet about Facebook app privacy - finger cramping from scrolling so much" /></p>

<p>Most people are already aware that Facebook has terrible privacy practices, but Apple still deserves a lot of credit for exposing Facebook so publicly on their official platform. Raising awareness about privacy is terrific, and this is definitely the right direction. So what’s the catch?</p>

<p>The problem with Apple’s App Privacy is that it’s entirely self-reported. The app developer gets to make whatever privacy claims they want, and none of that information is vetted. There’s no verification by Apple or by any other source.</p>

<p>App Privacy is not new. It’s a re-branding and simplification of the Privacy Policy, aka the “We Pinky-Promise to Not Steal Your Data” document. Unfortunately, App Privacy doesn’t fix the Privacy Policy’s inherent and critical flaw: Privacy Policies contain no proof of the privacy claims they make.</p>

<p>Apple doesn’t verify any of the App Privacy information that app developers submit - because they <em>can’t</em>. There is currently no way for Apple to know what an app does with user data after the data is sent to the app. But by calling it equivalent to “Privacy Nutrition Labels”, Apple irresponsibly implies that this privacy information is vetted, when that is absolutely false.</p>

<p>This results in two unintended consequences: it creates a false sense of security for users, and an incentive for more dishonest and privacy-invasive apps in the App Store.</p>
<h3 id="a-false-sense-of-security-for-users">A False Sense of Security for Users</h3>
<p>The Privacy Policy, and by extension, App Privacy, has been a failure due to its inaccuracy and lack of reliability. This is partially because even the app developers themselves may not know what user data is being given to third parties, or who those third parties give user data to.</p>

<p>One example of this is a recent privacy scandal involving the <a href="https://9to5mac.com/2020/11/20/us-military-buys-location-data-from-muslim-prayer-app-and-more/">mass selling of user data</a> to the U.S. military, with location data harvested from various apps - a Craigslist app, a Muslim prayer app, weather apps, and many others. This was possible because these apps used a third-party integration that sold location data. And since the app developer didn’t even know the third-party integration was doing this, they of course didn’t mention it in their apps’ Privacy Policies (or App Privacy) - they can’t include what they don’t even know.</p>

<p>Another example is the case of poor security practices, resulting in security breaches. It seems like every week, some company “regrettably” announces that <a href="https://haveibeenpwned.com/PwnedWebsites">they’ve been hacked</a>. Last week, it was SolarWinds, who apparently set their server password to “<a href="https://www.reuters.com/article/global-cyber-solarwinds/hackers-at-center-of-sprawling-spy-campaign-turned-solarwinds-dominance-against-it-idUSKBN28P2N8">solarwinds123</a>”. Negligent, cheap, and lazy security practices like this are commonplace, and are opaque to even the most detailed Privacy Policies.</p>

<p>Both real-world examples above seriously impact user data and privacy, and unfortunately in both cases, App Privacy doesn’t help, and worse, may give users a false sense of safety.</p>
<h3 id="incentivizing-dishonest-and-privacy-invasive-apps">Incentivizing Dishonest and Privacy-Invasive Apps</h3>
<p>The App Store ecosystem is a competitive place. For every app, there are at least two or three apps with similar functionality competing for the same users. And you don’t need five email apps - you need one good one. Users choose apps based on many factors: features, design, screenshots, reviews, and now with iOS 14, App Privacy. So to win the most users, developers are now incentivized to make their App Privacy look good. Key phrase: “<strong>look</strong> good”.</p>

<p>Again, App Privacy is based on Privacy Policies, so it relies on the app developer to be honest - it’s like asking restaurants to do their own health inspections and provide their own health scores. Now that App Privacy makes the Privacy Policy much more prominent, how does this affect the incentive structure for App Store apps?</p>

<p>Let’s say you’re choosing between two email apps on the App Store - both seem similar in features and design. Unbeknownst to you, however, one email app is created by a dishonest developer who intends to extract extra profit by selling your emails to third parties, while the other email app is honest and does not do this. Which app do you end up choosing?</p>

<p><img src="/assets/images/scamapp.jpg" alt="A table showing the incentive structure that Apple has created with App Privacy." /></p>

<p>In this situation, both email apps collect basic analytics. The dishonest app, however, writes in their App Privacy that they don’t collect or sell <em>any</em> data, while the honest app admits that they collect basic analytics. So you read the App Privacy for both apps, and decide that since you want to “maximize privacy”, you download the dishonest app - the one that secretly sells your emails to third parties. It’s not your fault - it’s the fault of a poor incentive structure.</p>

<p>This results in a nightmare feedback loop: Dishonest apps make more money due to their willingness to lie on their App Privacy, and then use their ill-gotten profits to buy Apple’s App Store Search Ads, which allows them to appear first in search results and rope in more downloads and more user data. Sell the user data, rinse and repeat. I previously wrote about the magnitude of top-selling apps doing exactly this on the App Store <a href="/2020/11/25/how-to-make-80000.html">here</a>. The App Store’s “scam apps” problem hasn’t gotten better since then, and the introduction of App Privacy will now help them seem even more legitimate than ever before to unsuspecting users.</p>

<h3 id="finding-apps-that-truly-respect-privacy">Finding Apps That Truly Respect Privacy</h3>
<p>So what can be done about App Privacy’s ease of abuse?</p>

<p>Apple has said that developers caught lying on their App Privacy will be banned, but this threat has no teeth. First, as mentioned earlier, it’s impossible for Apple to catch liars because Apple has no way of knowing if app developers are telling the truth about privacy - this threat is only effective against the most visible companies like Facebook, who are already under heavy scrutiny. Second, Apple is not financially incentivized to eliminate profitable apps from the App Store (since they take 30% of revenues as well as App Store Ads), and other than specific removals of a few scandals that go viral in the media, they aren’t spending the time or resources to individually verify the 2 million apps on the App Store.</p>

<p>Luckily, you don’t need to depend on Apple. Here’s how to find and choose truly privacy-respecting apps:</p>

<p>First, look for apps that are 100% open source. Open source is the “organic” of software, and it means the app’s code is publicly visible, so there’s nothing to hide. There are no unknown third-party integrations, and everything the app does, including collection of data and how it’s used, is accessible by everyone. Importantly, ensure that not just the app, but also the app’s servers (where your data is stored and transferred in the cloud) are 100% open source.</p>

<p>Second, check a site like <a href="https://privacyreview.co">Privacy Review</a> for neutral third party analyses on the tracking behaviors of specific apps. Instead of trusting App Store’s App Privacy, which is self-reported, these sites use tools like <a href="https://lockdownprivacy.com">Lockdown Privacy</a> to see exactly what connections to trackers are made - it’s like a Snopes, but for apps. Watch out, though, for review sites that get a “referral bonus” from your app signups or downloads - these are almost always scams, because they only get paid when you purchase the app.</p>

<p>Third, make sure that management and ownership of the company is clear. Who is the CEO? Where are they located and are they a real company? Or are they a series of offshore shell companies that allow the owners to stay anonymous? You’ll be surprised how often that last one is true, especially for so-called privacy apps that are <a href="https://www.techspot.com/news/60828-popular-free-vpn-service-hola-dodgy.html">malware</a> or <a href="https://9to5mac.com/2017/08/07/hotspot-shield-snooping-on-users-vpn/">data-mining</a> companies <a href="https://techcrunch.com/2020/12/16/australia-sues-facebook-over-its-use-of-onavo-to-snoop/">in disguise</a>.</p>

<h3 id="final-note">Final note</h3>

<p>Apple’s App Privacy creates a heavily-manipulated <em>illusion</em> of transparency, without any of the benefits of true transparency. It gives financial incentives for apps to be more dishonest, and Apple would be well-advised to change course on this for the health of the App Store ecosystem and their 1.5 billion customers.</p>

<p>App Privacy has a lot of potential. It shouldn’t just be a watered-down Privacy Policy that misleads users. It should instead adopt a verifiable transparency standard like <a href="https://openlyoperated.org">Openly Operated</a>, which puts the responsibility on the companies to prove their security and privacy claims before being allowed to access user data.</p>

<p>In the meantime, we advise you (and your family and friends) to take App Privacy with a heavy grain of salt, because it’s not at all a dependable indicator of trustworthiness - and may simply indicate an app developer’s willingness to lie.</p>]]></content><author><name>Lockdown Privacy</name></author><summary type="html"><![CDATA[“App Privacy” should give users verified information, not a false sense of security]]></summary></entry><entry><title type="html">The “Do Not Sell My Personal Data” Button Makes Absolutely No Sense</title><link href="https://blog.lockdownprivacy.com/2020/12/09/heres-what-the-do-not.html" rel="alternate" type="text/html" title="The “Do Not Sell My Personal Data” Button Makes Absolutely No Sense" /><published>2020-12-09T08:00:00+00:00</published><updated>2020-12-09T08:00:00+00:00</updated><id>https://blog.lockdownprivacy.com/2020/12/09/heres-what-the-do-not</id><content type="html" xml:base="https://blog.lockdownprivacy.com/2020/12/09/heres-what-the-do-not.html"><![CDATA[<h4 id="are-there-people-who-want-their-data-to-be-sold">Are there people who <em>want</em> their data to be sold?</h4>
<!--more-->

<p><img src="/assets/images/1*UyU_XbnAcNwlrD5WMqHWIw.jpeg" alt="" /></p>

<p>A few years ago, people started realizing that their personal data (browsing activity, IP address, etc) were being monetized and/or exploited for profit. To combat this, the California Consumer Privacy Act (CCPA) was passed, which said that companies have to let users say no to the sale of their personal data. Other states are now passing similar bills.</p>

<p>For the uninitiated, personal data like IP addresses and other unique identifiers are used by advertisers, data brokers, and tracking companies to build a shadow profile of you, and to follow you across websites. That’s why if you shop for blue hiking shoes, and then go read the news on a different site, you’ll see ads for blue hiking shoes on the news site. And because everyone has a unique IP address, it’s the most common and easily exploited piece of personal data.</p>

<p>So this new law passes, and now sites slap on a “Do Not Sell My Data” button to appease the regulators. What do they do? Do they work? I clicked a bunch of ‘em to find out:</p>

<p><img src="/assets/images/1*KXNhPPsLCP6EC4CTxEyAIQ.png" alt="Six sites with the &quot;Do Not Sell My Data&quot; button at the site footer." /><em>Six sites with the “Do Not Sell My Data” button at the site footer.</em></p>

<p>For shopping sites Bloomingdale’s, Nordstrom, and Lowe’s, the “Do Not Sell My Info” links lead to forms where, ironically, you’re required to give <em>more</em> of your personal data in order to proceed to the next step. If you’re someone who is concerned about your data being sold, why would you give away your full name, emails, phone number, home address, and credit card numbers to <em>another third party</em>? What are they going to do with this info?</p>

<p><img src="/assets/images/1*rDj7_6vfPzUpgdcVsMXCUQ.png" alt="How many companies are in the “Intel Alliance”? Who is OneTrust and what happens if Google acquires them? And WTF is “service-now.com”?" /><em>How many companies are in the “Intel Alliance”? Who is OneTrust and what happens if Google acquires them? And WTF is “service-now.com”?</em></p>

<p>On <a href="https://www.similarweb.com/website/thehill.com">popular</a> news site “The Hill”, clicking “Do Not Sell My Data” shows a dialog that links to the privacy policies of the 32 advertising and tracking companies they work with. The “Save &amp; Exit” button makes no sense because there are no options to save, and clicking it does nothing. The “Do Not Sell My Data” button dismisses the dialog without any visible response. Did that… do something? Do I have to click that every time I visit the site? And am I expected to read and stay updated with all 32 privacy policies just to read the news?</p>

<p><img src="/assets/images/1*j60jfkKvB5LHRfYsyfFzEg.png" alt="A sort of “Forbes 30 under 30” of tracking and ad companies. Congrats to all who made it." /><em>A sort of “Forbes 30 under 30” of tracking and ad companies. Congrats to all who made it.</em></p>

<p>Clicking “Do Not Sell My Data” buttons on other sites surfaced still more issues:</p>

<p>First, the opt-out forms apparently need to be completed using every browser and device I use to access the apps or sites. So the number of times I have to do this process is (number of sites &amp; apps) * (number of devices) * (number of browsers). The time wasted ends up getting pretty huge, pretty quickly.</p>
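<p>To put rough numbers on that multiplication: here’s a quick back-of-the-envelope calculation, using assumed (and fairly modest) counts for one person’s sites, devices, and browsers.</p>

```python
# Hypothetical but modest numbers for one person's digital life.
sites_and_apps = 50
devices = 3        # phone, laptop, tablet
browsers = 2       # e.g. Safari and Chrome

# Each site must be opted out of separately on every device and browser.
opt_outs = sites_and_apps * devices * browsers
minutes_each = 2   # assumed time to find the link and fill out the form

print(opt_outs)                       # 300 separate opt-out flows
print(opt_outs * minutes_each / 60)   # 10.0 hours of clicking
```

<p>Even with these conservative assumptions, that’s a full workday of form-filling — before any cookies get cleared.</p>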

<p><img src="/assets/images/1*ywnebVXKJ9ud_I3yoO0qpg.png" alt="Big “getting websites to stop selling my info” Mood." /><em>Big “getting websites to stop selling my info” Mood. <a href="https://www.biblio.com/the-myth-of-sisyphus-and-by-camus-albert/work/2720">Source</a></em></p>

<p>Second, I learn that every time I clear my cookies (which are another way websites track users), I have to redo everything. So if I attempt to reduce tracking by clearing my cookies, I’m actually also implicitly agreeing to let companies sell my personal data, and I have to opt-out on every site again. 🤦🏻‍♂️</p>

<p>Why do we have to find and click a tiny button, fill out forms, give more personal data, and jump through whatever hoops on every single site and app, on every device and browser, just to not have our personal data sold to third parties? Should we also have to specifically tell every restaurant we dine at to not spit in our food? Or have to specifically tell every plumber we hire to not steal from our homes?</p>

<p>Obviously, nobody wants their personal data to be sold, so a more reasonable system would be for every website to <em>by default</em> not sell users’ personal data, instead of requiring users to opt-out.</p>

<p>So if we changed the law to simply penalize companies that sold user data without explicit consent, would that work better?</p>

<p>Nope, that wouldn’t work either — for three reasons:</p>

<p>First, tracking companies are already <a href="https://www.reuters.com/article/us-usa-retail-privacy/do-not-sell-my-info-u-s-retailers-rush-to-comply-with-californias-new-privacy-law-idUSKBN1YY0RK">challenging</a> what “sale” in “sale of data” means — arguing that many instances are a gray area. For example, if TRACKERS-R-US paid ACME App to add a tracker, does this count as “sale”? A high-powered attorney could argue that it’s not, because the data goes directly from users to TRACKERS-R-US, so it’s not owned by ACME — and you can’t sell what you don’t own. Or what if ACME doesn’t get cash, but instead gets a share of revenue or some service in return? Or what if TRACKERS-R-US is branded as an “analytics tool” that’s “crucial” to the functioning of ACME App? These nitpicky questions may seem inane and stupid to you, but the fact that there’s even some tiny chance of ambiguity can create years of litigation and appeals, because companies, like anyone else, are innocent until proven guilty. In these cases, companies don’t need to win — they simply need to drag out the legal battles as long as possible (see Uber).</p>

<p>Second, enforcing laws about what companies should do internally is nearly impossible, because catching violations relies on self-reporting, and also because many violators are outside of the law’s jurisdiction. There’s simply no scalable way to know if companies are selling user data. If a company claims they don’t sell user data, but does it anyway, they’ll get away with it 99.999% of the time, because the only people that know about the violation are themselves. Add a few more 9’s if the company is based outside the USA.</p>

<p><img src="/assets/images/1*R5RUr066P2HIJWXd2tEskQ.png" alt="I thought Facebook might have put their “Do Not Sell” link under the “More” menu. Nah." /><em>I thought Facebook might have put their “Do Not Sell” link under the “More” menu. Nah.</em></p>

<p>Third, some companies just don’t give a flying f about what the laws say, because they can easily afford the <a href="https://www.businessinsider.com/facebook-stock-rose-news-5-billion-ftc-settlement-why-critics-2019-7">fines</a>, and because the political climate doesn’t exactly lend itself to serious regulatory action against mega-corporations. While companies like CNN and Wal-Mart make attempts to comply by adding “Do Not Sell” links, Facebook (who collects more personal data than anybody else) has completely ignored it, choosing instead to wait for the next slap on the wrist.</p>

<p>If you want companies to not sell your personal data, we know what <em>doesn’t work</em>: It doesn’t work to spend all your free time clicking “Do Not Sell” buttons that probably do nothing, and it doesn’t work to pass regulations that are ultimately ignored or rely on ineffective self-policing.</p>

<p>So what <em>would</em> work?</p>

<p>Let’s go to the source: Sites and apps you <em>want</em> to use are simultaneously serving you third-party trackers that you <em>don’t want</em>. Remember news site The Hill from earlier and their 32 tracking companies? You <em>want</em> the news, but you <em>don’t want</em> the third-party data sharing. And if it’s an app (instead of a site), this tracking can even happen in the background, when the app isn’t even open.</p>

<p>The simple solution that we (two ex-Apple engineers) came up with is to directly block the trackers, so that your personal data doesn’t get out to third parties in the first place. This is way more effective than allowing tracking and hoping that apps and sites don’t later sell your info.</p>

<p>We built a free and open source app called Lockdown that you can <a href="https://lockdownprivacy.com">download right now</a>, and it blocks trackers, ads, and badware in not just your browsers, but <em>all apps.</em> So you don’t have to stop reading the news, online shopping, or playing games — just install Lockdown, push a button to activate it, and then go on living your life — we take care of the rest. Here’s what it looks like:</p>

<p><img src="/assets/images/1*699ZuPFQ9Qc0aOcdJifXIg.png" alt="Simple to use for everyone, powerful customizability for advanced users." /><em>Simple to use for everyone, powerful customizability for advanced users.</em></p>

<p>“Wait a second… <em>free</em>? What’s the catch?”, you’re asking, “Are you guys trying to Zuck us over in some hidden, nefarious way?”</p>

<p>Nope. We’re pretty open about how we pay the bills. Lockdown lets you automatically block trackers with its free Firewall, but if you want more protection by hiding your IP address and encrypting your connections (for safety on public wi-fi and insecure sites/apps), you can pay for Lockdown’s <a href="https://openlyoperated.org/report/confirmedvpn">fully-audited</a> Secure Tunnel (VPN) service. Revenue goes to keeping the Firewall free and updated with the constantly changing (and increasingly clever) landscape of trackers, ads, and badware.</p>

<p>We believe people and companies that build privacy products have a unique responsibility to be more transparent than any other product line. That’s why Lockdown is 100% open source and <a href="https://openlyoperated.org">openly operated</a> — so that anyone can see what it’s doing, and just as importantly, what it’s <em><a href="https://techcrunch.com/2019/02/21/facebook-removes-onavo/">not</a></em> <em><a href="https://www.computerweekly.com/news/252466203/Top-VPNs-secretly-owned-by-Chinese-firms">doing</a></em>.</p>

<p>We built Lockdown because it’s something we wished existed: a simple, transparent, and powerful tool for stopping invasive third-party tracking. We hope it can do the same for you. Get it for free at <a href="https://lockdownprivacy.com">LockdownPrivacy.com</a>.</p>

<p>(Update: Lockdown was featured by the App Store and also <a href="https://www.forbes.com/sites/kateoflahertyuk/2020/03/06/meet-lockdown-the-app-that-reveals-whos-tracking-you-on-your-iphone/#3ef7be1b59eb">featured in Forbes</a> — check it out and share the story!)</p>
<!--more-->

<p>When you send a photo to someone, your messaging app first sends the photo to the app’s server, which then sends the photo to them:</p>

<p><img src="/assets/images/1*KKVbLVyMWiypj0Qaopg27A.png" alt="" /></p>

<p>And sure, in the ’90s, this might have been what happened. But somewhere along the line, someone figured out how to profit from user data, and so now here’s what <em>actually</em> happens:</p>

<p><img src="/assets/images/1*vHOaHP4CGZdluwt7m3xoOw.png" alt="" /></p>

<p>And that’s just sending photos. Today, you give apps access to your camera, location, microphone, contacts, browsing habits, even your medical records. After you tap “Allow” once, an app can even upload your entire photo and video library to their servers in the background while you’re sleeping.</p>

<p>The Internet is facilitating an <a href="https://theintercept.com/2017/04/24/stop-using-unroll-me-right-now-it-sold-your-data-to-uber/">insane</a> <a href="https://www.theverge.com/2018/4/24/17275994/yahoo-sec-fine-2014-data-breach-35-million">free-for-all</a> <a href="https://www.forbes.com/sites/kashmirhill/2014/10/03/god-view-uber-allegedly-stalked-users-for-party-goers-viewing-pleasure/#75ddf2383141">for</a> <a href="https://www.npr.org/sections/thetwo-way/2017/03/14/520123490/vibrator-maker-to-pay-millions-over-claims-it-secretly-tracked-use">our</a> <a href="https://www.reuters.com/article/us-facebook-privacy-firing/facebook-employee-fired-over-bragging-about-access-to-user-information-idUSKBN1I334E">personal</a> <a href="https://www.clickondetroit.com/news/concerns-over-misuse-of-childrens-online-data-grow-as-apps-illegally-collect-sell-information">data</a>, with potential consequences getting <a href="https://www.nytimes.com/2018/03/04/technology/fake-videos-deepfakes.html">worse</a>. Apps even exploit this data with <a href="https://www.ibtimes.com/how-uber-other-digital-platforms-could-trick-us-using-behavioral-science-unless-we-2791467">behavioral science</a> to squeeze every <a href="https://clark.com/shopping-retail/mac-users-being-fed-pricier-hotel-searches/">dollar</a> or <a href="https://www.businessinsider.com/how-app-developers-keep-us-addicted-to-our-smartphones-2018-1">minute</a> out of their users, when it’s <a href="https://www.washingtonpost.com/news/monkey-cage/wp/2018/08/06/its-no-accident-that-facebook-is-so-addictive/?utm_term=.1058706f817b">clearly</a> against the <a href="https://www.vox.com/the-goods/2018/10/30/18044678/kids-apps-gaming-manipulative-ads-ftc">users’ best interests</a>. Today, companies have every incentive to exploit our data for profit, and no incentive to protect our privacy.</p>

<p>Since we’re only going to rely more on apps over time, the critical question is:</p>

<h2 id="how-do-you-know-if-you-can-trust-an-app"><strong>How do you know if you can trust an app?</strong></h2>

<h3 id="trust-through-privacy-policy">Trust Through Privacy Policy?</h3>

<p>When you ask a company about protecting your data, they respond by telling you to read their Privacy Policy, which is a document they wrote (or <a href="https://duckduckgo.com/?q=privacy+policy+generator">copy-pasted</a>) that promises they’ll protect your data.</p>

<p>But wait, isn’t that circular logic? I should trust that they’re protecting my data because… they have a document that says they’ll protect my data? How do I know they’re doing any of the things they claim in the Privacy Policy?</p>

<p>It turns out it’s impossible to know if an app company is violating their Privacy Policy (or violating privacy laws), because there’s literally nothing stopping them: they’re Privacy <em>Policies</em>, not Privacy <em>Proofs.</em> Not only that, they’re actually not <a href="https://ir.lawnet.fordham.edu/iplj/vol27/iss1/5/">legally binding</a>, and in the rare cases when companies actually <em>do</em> get caught, <a href="https://www.abine.com/blog/2012/facebook-privacy-violated-by-new-ads/">the</a> <a href="https://www.theverge.com/2018/4/24/17275994/yahoo-sec-fine-2014-data-breach-35-million">penalties</a> <a href="https://uk.reuters.com/article/us-facebook-france/facebook-fined-150000-euros-by-french-data-watchdog-idUKKCN18C10C">are</a> <a href="http://www.consumerwatchdog.org/blog/google-ruling-shows-need-do-not-track-and-strong-antitrust-action">unbelievably light</a>. And as recent government (in)action on <a href="https://www.reuters.com/article/us-usa-equifax-cfpb/exclusive-u-s-consumer-protection-official-puts-equifax-probe-on-ice-sources-idUSKBN1FP0IZ">data breaches</a>, <a href="https://www.npr.org/2017/03/28/521831393/congress-overturns-internet-privacy-regulation">ISP privacy rules</a>, and <a href="https://www.cnet.com/news/net-neutrality-is-now-really-officially-dead-open-internet-congress-now-what/">net neutrality</a> show, often there are no penalties at all.</p>

<p>Privacy Policies and regulations do not create real trust; they only provide a false sense of security and privacy.</p>

<h3 id="trust-through-pricing">Trust Through Pricing?</h3>

<p>It’s a common saying on the internet: “If the product is free, then you’re the product.” And while that’s sometimes true since revenue must come from somewhere, some people make the <a href="https://en.wikipedia.org/wiki/Denying_the_antecedent">logical fallacy</a> of thinking the inverse must also be true: “If the product is not free, then you’re not the product.”</p>

<p>Due to this mistake, some people use price as a criterion when choosing apps to use, by looking for apps that aren’t free and making the false assumption that non-free products will not exploit their data for profit.</p>

<p>Of course, it’s entirely possible for a company to both charge you for an app and profit off your data or skimp on security. Pricing is therefore a bad criterion for finding an app you can trust.</p>

<h3 id="trust-through-aestheticsdesign">Trust Through Aesthetics/Design?</h3>

<p>Woah, those app screenshots look so sleek! And their website is so colorful and tastefully designed, with beautiful animations that you simply can’t resist. Why would an adorable cartoon bear lie to you? Is that even possible?</p>

<p>Well sadly, yes — cartoon characters lie all the time. Since they were created by a human and their dialogue is written by a human, an adorable cartoon bear is no less likely to exploit your personal data for profit. It might look cuter while doing it, though.</p>

<p>The aesthetics of a website might tell you that they spent $20 on a SquareSpace theme (or pirated it), but they say nothing about how trustworthy an app or service is — it’s even possible that the company skimped on data security in order to spend more on their website’s design and animations.</p>

<h3 id="trust-through-popularity">Trust Through Popularity?</h3>

<p>If all your friends jumped off a digital bridge, would you? At one point, Yahoo had over three billion accounts, and in 2013, they broke the world record 🎉 for the biggest data breach ever, by a <a href="https://www.csoonline.com/article/2130877/the-biggest-data-breaches-of-the-21st-century.html">very long shot</a>. Since then, there have been many more breaches of tens or hundreds of millions of accounts at other companies. And that only counts disclosed and known breaches — nobody knows what the real numbers are.</p>

<p>Popularity isn’t a reliable proxy of how trustworthy an app is. In fact, there are even scam apps that make it into the <a href="https://blog.lockdownprivacy.com/2020/11/25/how-to-make-80000.html">top charts</a> of the App Store.</p>

<h2 id="so-what-actually-creates-trust">So what actually creates trust?</h2>

<p>Apps should have to <em>earn the trust of their users</em>, especially when there are such strong financial incentives for companies to simply lie and abuse user data.</p>

<p>To earn user trust, apps should be fully transparent — the public should be able to see everything the app and its servers are doing, so that anyone can verify that there’s no negligent, dishonest, or even malicious activity. In other words: trust through transparency.</p>

<h3 id="trust-through-transparency">Trust Through Transparency</h3>

<p>Full transparency means making the <em>entire operation</em> of an app public and verifiable, from the app code on your phone or computer, to the server code and infrastructure on the cloud, to the actions of the company’s employees, <em>plus</em> proof of all of that. It’s <em>everything</em> that touches your data. Everything.</p>

<p>If getting full and verifiable transparency from the apps we use every day seems like a radical idea, it’s because we’ve been trained for so long to expect so little from companies. We’ve been trained to upload our personal data, cross our fingers, and simply hope for the best. The truth is, if we’re giving companies our most sensitive personal information, why shouldn’t we expect them to give us proof of exactly what they’re doing with it?</p>

<h3 id="a-standard-for-transparency">A Standard For Transparency</h3>

<p>To be clear, <em>partial</em> transparency is insufficient and misleading, because it still allows “bad bits” to be hidden, defeating the purpose of transparency. For example, a company hiding just a small part of their server code is still able to secretly copy all user data to unknown third parties from their servers.</p>

<p>So how do we know if an app is being fully transparent, versus only partially transparent or not transparent at all?</p>

<p>A standard for full transparency doesn’t exist today, so we’re creating one and giving it away for free.</p>

<p>This new standard is called <strong><a href="https://openlyoperated.org">Openly Operated</a></strong>, because full transparency requires the <em>entire operation</em> of an app to be <em>open</em> and verifiable. This includes making public all app source code, server code, infrastructure, and employee actions, as well as providing proof of accuracy and validity. It’s like giving the public read-only access to the app operator’s Admin console (example <a href="https://openlyoperated.org/report/openlyoperated#read-only-account">here</a>).</p>

<p>How is this different from apps today? Here’s the photo-sending example from the beginning again — except this time, the app is Openly Operated:</p>

<p><img src="/assets/images/1*rJLMAkXuf-6lTJj6Qj1Oqw.png" alt="" /></p>

<p>Unlike the earlier examples, the Openly Operated <a href="https://openlyoperated.org/how-to">certification process</a> forces the app to be fully and verifiably transparent, preventing the app’s operators from hiding privacy and security issues. This process, at a high level, is:</p>

<ol>
  <li>
    <p>The app fulfills specific <strong><a href="https://openlyoperated.org/how-to#fulfill-requirements">requirements</a></strong> to demonstrate full transparency, and uses direct references to <a href="https://openlyoperated.org/how-to#open-source">source code</a>, <a href="https://openlyoperated.org/how-to#open-infrastructure">infrastructure</a>, and other evidence to <a href="https://openlyoperated.org/how-to#claims-with-proof">prove the app’s privacy or security claims</a>.</p>
  </li>
  <li>
<p>These requirements and proofs of claims are combined into an Openly Operated <strong><a href="https://openlyoperated.org/how-to#assemble-audit-kit">Audit Kit</a></strong> that anyone can publicly view and verify.</p>
  </li>
  <li>
<p>The app is matched with independent <a href="https://openlyoperated.org/auditors">auditors</a>, who verify the Audit Kit and produce public Openly Operated <strong><a href="https://openlyoperated.org/reports">Audit Reports</a></strong>, detailing their verifications and providing a summary.</p>
  </li>
</ol>

<p>This lets everyone participate in “trust through transparency”: users who are more technical can perform verifications themselves by diving into the nitty gritty details in the Audit Kit, while less tech-savvy users can read the independent Audit Reports and summaries. Openly Operated’s transparency is the opposite of the status quo, where apps simply tell users to read their totally unproven and unverifiable Privacy Policy.</p>

<p><a href="https://openlyoperated.org">Openly Operated</a> is a free certification. <a href="https://openlyoperated.org/about-us">Its mission</a> is for all apps to earn trust through transparency, so all <a href="https://openlyoperated.org/how-to">documentation</a> is available at no cost, and companies pay nothing to license the certification. We’ve even <a href="https://openlyoperated.org/reports">built examples</a> to show that Openly Operated apps are possible. These are more than proof-of-concepts — they’re in production, fully functional, and are operating at scale with real users.</p>

<h2 id="everything-should-be-openly-operated">Everything Should Be Openly Operated</h2>

<p>Companies have been blatantly dishonest about how they handle and secure user data for too long. Since its creation, Facebook has had a privacy setting for user posts labeled “Only Me”. To any regular person, “Only Me” has a simple meaning: me, and literally nobody else.</p>

<p>But over the last ten years, we’ve learned the hard way that Facebook has a very different definition of “Only Me”. To Facebook, “Only Me” means “Me and <a href="https://www.cbsnews.com/news/facebook-your-personal-info-for-sale/">All Of</a> <a href="http://content.time.com/time/nation/article/0,8599,1532225,00.html">Facebook’s</a> <a href="http://fortune.com/2017/10/27/facebook-russian-election-ads/">Advertisers</a> and <a href="https://www.bloomberg.com/news/articles/2018-04-04/facebook-scans-what-you-send-to-other-people-on-messenger-app">Their</a> <a href="https://www.axios.com/facebook-whatsapp-targeted-ads-user-privacy-c1e18e9b-ed76-4954-ab74-a64a88647e8c.html">Partners</a> and Some Of <a href="http://fortune.com/2018/04/03/facebook-videos-delete-personal-data/">Facebook’s</a> <a href="https://motherboard.vice.com/en_us/article/bjp9zv/facebook-employees-look-at-user-data">25,000</a> <a href="https://thehackernews.com/2015/02/facebook-acccount-password.html">Employees</a> and Some <a href="https://www.theverge.com/2019/5/6/18530887/facebook-instagram-ai-data-labeling-annotation-private-posts-outsourced">Unknown Number</a> <a href="https://www.reuters.com/article/us-facebook-privacy-firing/facebook-employee-fired-over-bragging-about-access-to-user-information-idUSKBN1I334E">Of Contractors</a> and <a href="https://www.rappler.com/technology/news/200508-cambridge-analytica-other-facebook-quiz-apps-brittany-kaiser">Facebook Apps That Friends</a> or <a href="https://www.cnbc.com/2018/04/08/cubeyou-cambridge-like-app-collected-data-on-millions-from-facebook.html">I Have Used</a> and <a href="http://www.latimes.com/business/la-fi-facebook-sells-data-to-chinese-20180605-story.html">Those Apps’</a> <a href="https://www.theguardian.com/news/2018/mar/20/facebook-data-cambridge-analytica-sandy-parakilas">Employees</a> and <a href="https://www.cnbc.com/2018/04/16/facebook-collects-data-even-when-youre-not-on-facebook.html">Anyone Those Apps</a> <a 
href="https://www.marketwatch.com/story/spooked-by-the-facebook-privacy-violations-this-is-how-much-your-personal-data-is-worth-on-the-dark-web-2018-03-20">Share Or Sell Data To</a>… <a href="https://www.ftc.gov/news-events/press-releases/2011/11/facebook-settles-ftc-charges-it-deceived-consumers-failing-keep">Maybe</a>”.</p>

<p><img src="/assets/images/1*El8rgOdv_tVUSkEwkqenZg.png" alt="Probably need a smaller font to fit the truth here." /><em>Probably need a smaller font to fit the truth here.</em></p>

<p>Privacy and security scandals happen every week not because companies are evil, but because like anything else, companies operate on incentives. In a world where there’s no way to verify an app’s security or privacy claims, why should a company be honest and make less money, while their competitors are being dishonest and making more money? Current incentives give dishonest and insecure companies an edge to grow faster, compete more efficiently, spend more on marketing, and capture the most customers.</p>

<p>Openly Operated provides a structured way for companies to <em>prove</em> their privacy and security claims. Users have nothing to lose and everything to gain by demanding transparency from the apps they give their personal data to. The question shouldn’t be “Why should the apps I use be transparent?” — it should be “Why <em>aren’t</em> the apps I use transparent? What are they hiding?”</p>

<p>Learn more at <a href="https://openlyoperated.org">OpenlyOperated.org</a>. Whether you’re a user curious about the <a href="https://openlyoperated.org/user-benefits">many benefits</a> of transparency, an engineer <a href="https://openlyoperated.org/how-to">building apps people can trust</a>, or a company that wants to <a href="https://openlyoperated.org/for-companies">win customers while increasing security</a>, Openly Operated has something to offer you.</p>

<p>Wouldn’t it be nice if “Only Me” really meant “Only Me”?</p>]]></content><author><name>Lockdown Privacy</name></author><summary type="html"><![CDATA[Let’s hope every tech company steals this idea.]]></summary></entry><entry><title type="html">How to Make $80,000 Per Month on the Apple App Store</title><link href="https://blog.lockdownprivacy.com/2020/11/25/how-to-make-80000.html" rel="alternate" type="text/html" title="How to Make $80,000 Per Month on the Apple App Store" /><published>2020-11-25T08:00:00+00:00</published><updated>2020-11-25T08:00:00+00:00</updated><id>https://blog.lockdownprivacy.com/2020/11/25/how-to-make-80000</id><content type="html" xml:base="https://blog.lockdownprivacy.com/2020/11/25/how-to-make-80000.html"><![CDATA[<h4 id="its-far-easier-than-you-think-no-luck-or-perseverance-necessary">It’s far easier than you think. No luck or perseverance necessary.</h4>
<!--more-->

<p><em>Note: I originally published this story on Medium, where it received 45,000 likes (#52 overall that year). I’m republishing it here because writing this helped spark my interest in transparency and trust in software, and eventually building <a href="https://lockdownprivacy.com" target="_blank">Lockdown Privacy</a>, <a href="https://openlyoperated.org" target="_blank">Openly Operated</a>, and <a href="https://privacyreview.co" target="_blank">Privacy Review</a>.</em></p>

<p>At WWDC, Apple reported that they’ve paid out $70 billion to developers, with 30% of that ($21 billion!) paid in the last year alone. That’s a <em>huge</em> spike, and it surprised me, because it didn’t seem like my friends and I were spending more on apps last year. But that’s anecdotal, so I wondered: where are these revenues coming from? I opened the App Store to browse the top grossing apps.</p>

<h2 id="step-1-follow-the-money">Step 1: Follow The Money</h2>

<p><img src="/assets/images/1*FDxvuLRkNMRRVvUda36LFA.png" alt="" /></p>

<p>I scrolled down the list in the Productivity category and saw apps from well-known companies like Dropbox, Evernote, and Microsoft. That was to be expected. But what’s this? The #10 Top Grossing Productivity app (as of June 7th, 2017) was an app called “Mobile protection :Clean &amp; Security VPN”.</p>

<p>Given the terrible title of this app (inconsistent capitalization, a misplaced colon, and the grammatically nonsensical “Clean &amp; Security VPN”), I was sure this was a bug in the rankings algorithm. So I check Sensor Tower for an estimate of the app’s revenue, which shows… $80,000 per month?? That couldn’t possibly be right. Now I was <em>really</em> curious.</p>

<p>I tap into the app details to see that the developer is “Ngan Vo Thi Thuy”. Wait so, this is a VPN service offered by an independent developer who didn’t even bother to incorporate a company? That’s a huge red flag. For those of you who don’t know why this is bad, a VPN basically routes all your internet traffic through a third party server. So in this case, a random person who couldn’t piece together a grammatically correct title, who also didn’t bother to incorporate a company, wants access to all your internet traffic.</p>

<p>Another red flag was this comically terrible app description:</p>

<p><img src="/assets/images/1*8evBUPzd_qAo0A7YUavHaQ.png" alt="Direct screenshot from the “Mobile protection :Clean &amp; Security VPN” app description." /><em>Direct screenshot from the “Mobile protection :Clean &amp; Security VPN” app description.</em></p>

<p>According to this, “Mobile protection :Clean &amp; Security VPN” is “Full of features” — well, it’s certainly full of <em>something</em>. Apparently, “Mobile protection” includes protecting you from “dupplicate” contacts. And these “scans” are what the screenshots claim as “Quick &amp; Full Scan Internet Security”. Five internets to anyone who can figure out the relationship between Internet Security and duplicate contacts.</p>

<p>All these red flags — and I haven’t even downloaded the app yet. I check the Reviews tab to find some vague, fake-looking 5-star reviews:</p>

<p><img src="/assets/images/1*lP_-XxGuY2G3D5kwNNqgvA.png" alt="" /></p>

<p>Seeing the dates on these reviews brought up another question. How long has this app been up? Well, according to Sensor Tower, “Mobile protection :Clean &amp; Security VPN” has been a top 20 grossing Productivity app since at least April 20th (almost 2 months now).</p>

<h2 id="step-2-duplicitous-behavior">Step 2: Duplicitous Behavior</h2>

<p>Out of curiosity about this supposedly top grossing app, I download it. Here’s what happens when I open it for the first time:</p>

<p><img src="/assets/images/1*0CLMsW13YkPTf6gJ_-gDLA.gif" alt="" /></p>

<p>Yes, “<strong>This app need to cccess to your Contact to scan your Contact first.</strong>” The only option here is to tap Agree, and then iOS asks me if I want to give this app “cccess” to my contacts. Uhm, no thank you?</p>

<p><img src="/assets/images/1*5diXWOginryZgNa8RFWOVQ.png" alt="" /></p>

<p>After skipping that, the app tells me my device is at risk. Of course it is. It’s also ready to “Device Analyze”, Quick and Full Scan, and secure my internet (I can’t wait!).</p>

<p>Tapping “Device Analyze” shows my iPhone’s free memory and storage — a useless and irrelevant feature.</p>

<p>Tapping both Quick Scan and Full Scan shows:</p>

<p><strong>“Your contact is cleaned. No dupplicated found.”</strong></p>

<p>Oh good — no duplicates, except for the extra “p” in “dupplicated”, I guess? 🤷🏻‍♂️</p>

<p>Okay, so let me finally secure my internet by tapping “Secure Internet”. Hmm, what’s this—?</p>

<p><img src="/assets/images/1*Pl6qqjlmplxyOGtgBg-9PA.png" alt="play WITHOUT installing? oh boy!" /><em>play WITHOUT installing? oh boy!</em></p>

<p>Up comes this incredibly generous offer to play a bubble shooter game <em>without</em> installing! Not sure what I did to deserve this amazing free gift, but it will have to wait. I tap the “X” to return to securing my internet.</p>

<p>Here’s the next screen:</p>

<p><img src="/assets/images/1*N1cgm5SZUGDsy7CTiHlHRw.png" alt="Such generous. Much design. Very scam." /><em>Such generous. Much design. Very scam.</em></p>

<p>And obviously, I jump at the opportunity to “Instantly use full of smart anti-virus” by tapping “FREE TRIAL”. It’s free, after all.</p>

<p><img src="/assets/images/1*l-1_qUqc2GYRrKoXszHkGg.png" alt="" /></p>

<p>Touch ID? Okay! Wait… let’s read the fine print:</p>

<p><strong>“Full Virus, Malware scanner”</strong>: What? I’m pretty sure it’s impossible for any app to scan my iPhone for viruses or malware, since third party apps are sandboxed to their own data, but let’s keep reading…</p>

<p><strong>“You will pay $99.99 for a 7-day subscription”</strong></p>

<p>Uhh… come again?</p>

<p>Buried on the third line in a paragraph of text in small font, iOS casually tells me that laying my finger on the home button means I agree to start a $100 subscription. And not only that, but it’s $100 PER WEEK? I was one Touch ID away from a <strong>$400 A MONTH subscription to reroute all my internet traffic to a scammer?</strong></p>

<p>I guess I was lucky I actually read the entire fine print. But what about other people?</p>

<h2 id="step-3-its-all-starting-to-ad-up-to-profit">Step 3: It’s All Starting to “Ad” Up… to Profit</h2>

<p>It suddenly made a lot of sense how this app generates $80,000 a month. At $400/month per subscriber, it only needs to scam 200 people to make $80,000/month, or $960,000 a year. Of that amount, Apple takes 30%, or $288,000 — from just this one app.</p>

<p>At this point, you might still be in disbelief. Maybe you’re thinking: “Sure, just 200 people, but still, it seems highly unlikely that even one person would download this scammy looking app, much less pay for it.”</p>

<p>Maybe you wouldn’t download it. I certainly wouldn’t. But I’ve also never clicked on a Google Ad, yet Google somehow rode AdWords to $700 billion today. “Mobile protection :Clean &amp; Security VPN” is currently ranked #144 in most downloaded free Productivity apps in the App Store, with an estimated 50,000 downloads in April.</p>

<p>To get 200 subscribers from 50,000 downloads, they just need to convert 0.4% to purchases — or maybe even fewer, because <strong>these subscriptions are automatically renewing</strong>, so the subscribers stack month over month. <strong>Can you really not imagine one of your tech-illiterate relatives accidentally (or even intentionally) subscribing to this “free trial” to protect their iPad from viruses?</strong></p>
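The economics above are easy to check with a few lines, using the article’s rounded figure of roughly $400 a month per subscriber (variable names are mine, for illustration):

```python
# Back-of-the-envelope scam economics, using rounded numbers from the text.
per_subscriber_monthly = 400        # ~$99.99/week, billed weekly
subscribers = 200

monthly_revenue = subscribers * per_subscriber_monthly  # $80,000/month
yearly_revenue = monthly_revenue * 12                   # $960,000/year
apple_cut = int(yearly_revenue * 0.30)                  # $288,000/year to Apple
conversion_needed = subscribers / 50_000                # 0.4% of one month's downloads

print(monthly_revenue, yearly_revenue, apple_cut, conversion_needed)
```

And since subscriptions auto-renew, the 0.4% is an upper bound on the conversion rate needed in any single month — new victims stack on top of existing subscribers.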

<p>But how did this app get 50,000 downloads in the first place?</p>

<p>I remembered reading that a large percentage of apps are discovered through App Store search. So maybe this app somehow had really good App Store Optimization. I searched the App Store for “virus scanner”:</p>

<p><img src="/assets/images/1*cRWgmFj05K-GnCkdMdMSbA.png" alt="" /></p>

<p>The first result is an ad for “Protection for iPhone — Mobile Security VPN”. Well, that sounds familiar. This isn’t the same app, but this one’s In-App Purchase is “Free Trial to Premium Protection” for $99.99, and it’s ranked #33 for Top Grossing in the Business category.</p>

<p>Turns out, scammers are abusing Apple’s relatively new and immature App Store Search Ads product. They’re taking advantage of the fact that there’s no filtering or approval process for ads, that ads look almost indistinguishable from real results, and that some ads take up the entire first page of search results.</p>

<p>Later, I dug deeper and found that unfortunately, these aren’t isolated incidents — they’re fairly common in the App Store’s top grossing lists. And this isn’t just happening with security-related keywords; scammers seem to be bidding on many others. Here’s a search for “wifi”:</p>

<p><img src="/assets/images/1*AiHE5tvxiM0HseFkuHy2Jg.png" alt="" /></p>

<p>The top result is an ad for “WEP Password Generator”, a simple random string generator that charges $50/month. It’s already making $10,000 per month, despite being released in April. It’s likely a clone of <a href="https://sensortower.com/ios/us/jose-maria-delgado-delgado/app/wifi-password-generator-wep-keys-for-your-modem/884331356/">this app</a>, which indicates that yes, this scheme has become so large that scammers are copying each other.</p>

<h2 id="fixing-the-app-store-what-you-can-do">Fixing the App Store: What You Can Do</h2>

<p>Well first, if you’re reading this as a developer who also happens to have a less-than-average sense of morality, congratulations! You’ve just learned a (relatively) easy way to make tens of thousands of dollars on Apple’s App Store — at least until they change something. Otherwise, here are a few suggestions:</p>

<ol>
  <li>
    <p>Teach your less tech-savvy friends and relatives how to <a href="https://www.imore.com/how-to-cancel-app-store-subscription-ipad-iphone-mac-apple-tv">check and disable subscriptions</a>. If they’re affected, have them get <a href="https://www.imore.com/how-to-get-refund-itunes-app-store">refunds</a>.</p>
  </li>
  <li>
    <p>Report scam apps when you see them with the iTunes Connect <a href="https://itunesconnect.apple.com/WebObjects/iTunesConnect.woa/wa/jumpTo?page=contactUs">Contact Us</a> form. Select “Feedback and Concerns” and “Report a Fraud Concern”.</p>
  </li>
  <li>
    <p>Signal boost until Apple fixes this by sharing with friends and family.</p>
  </li>
</ol>

<h2 id="fixing-the-app-store-what-apple-should-do">Fixing the App Store: What Apple Should Do</h2>

<p>It’s somewhat hard to believe that Apple isn’t aware of this problem, since these apps aren’t small fry — they’re all over the top lists on the App Store. It could be that they simply don’t consider it a big enough problem to deal with, or that it just happens to be a very profitable problem for their Search Ads and App Store platforms. Either way, here are some suggestions:</p>

<ol>
  <li>
    <p><strong>Remove Scams and Refund Users</strong>: The most obvious. Simply hire someone to proactively and regularly scroll through top apps and remove scams. As you can tell from above, these are not hard to spot at all. And for people who have purchased scam subscriptions, automatically and fully refund all of those past purchases.</p>
  </li>
  <li>
    <p><strong>Better UI on Touch ID Subscriptions</strong>: Don’t use small, fine print with the price buried in the text (see “Free Trial” screenshot above). The price should be much more prominent, possibly with a required five-second delay before a purchase can be made. As a bonus, maybe show the app’s most useful/recent ratings or reviews here.</p>
  </li>
  <li>
    <p><strong>Stricter Review of Subscriptions</strong>: How do in-app purchases called “Full Virus, Malware Scanner” get approved by app review for $400 per month? Is anyone home? When a layperson sees this name in an email receipt with a slick green badge icon, they probably don’t cancel it because it looks as official as their Apple Music receipts. And for some apps like <a href="https://sensortower.com/ios/us/ngoc-nguyen/app/security-mobile-vpn-protection-anti-track-virus/1209482476/">this one</a>, despite its in-app purchase being named “Free Trial to Premium”, it wasn’t a trial at all — it was an immediate purchase.</p>
  </li>
  <li>
    <p><strong>Prompt for Delete Subscription on App Deletion:</strong> Many 1-star reviewers on scam apps said they were getting charged even though they deleted the app. To most people, that’s the way it should work — so why doesn’t it work that way? When a user deletes an app, ask the user if they want to also cancel their subscription. Of course, confirm it again so they don’t accidentally cancel their Netflix.</p>
  </li>
  <li>
    <p><strong>Easier Cancellation of Subscriptions</strong>: Subscriptions are so difficult to cancel, it’s almost as if the design-focused Apple intentionally made it hard. On iOS 10, cancelling a subscription is literally a <a href="https://www.imore.com/how-to-cancel-app-store-subscription-ipad-iphone-mac-apple-tv">nine-step process</a>. Even installing a third-party keyboard is easier (six steps). Make this simpler, please. And no, the tiny “Report a Problem” button on email receipts isn’t enough. (Update: I’m actually <a href="http://i.imgur.com/qwAn4ZT.png">unable to refund</a> one of the scam subscriptions, even through the official Apple links.)</p>
  </li>
  <li>
    <p><strong>Fraud- and Abuse-Proof Search Ads</strong>: Part of what makes this scam easy to run is how new App Store Search Ads are. Many regular users probably don’t even know they’re clicking an ad. At the least, Apple should review ads for potential fraud before running them (Facebook and Google both do this), and make it more obvious that the top result is an ad.</p>
  </li>
  <li>
    <p><strong>Fine and Take Legal Action</strong>: This suggestion is last because it’s unlikely Apple will do it. There’s currently no incentive not to build scam apps. The worst that can happen is getting your account deleted — which doesn’t matter, because you still keep the ill-gotten profits, and you can still make a new account and do it again. Create a deterrent effect by fining and taking legal action against the worst offenders.</p>
  </li>
</ol>

<h2 id="final-note">Final Note</h2>

<p>App developers take pride in the fact that if their creation adds value, or improves people’s lives in some way, then people will be happy to pay for it, and everybody benefits. Not only that, but making good apps requires design, engineering, and sales skills, as well as a ton of dedication and hard work.</p>

<p>So, aside from the obvious moral wrongs of exploiting the vulnerable for profit, it’s extremely disheartening to know that some developers are becoming financially successful the easy and unethical way — by making bogus apps that take a few hours to code, and whose functionality is purely to steal from the less well-informed.</p>]]></content><author><name>Lockdown Privacy</name></author><summary type="html"><![CDATA[It’s far easier than you think. No luck or perseverance necessary.]]></summary></entry></feed>