<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: Parental choice…</title>
	<atom:link href="http://blog.ferpasherpa.org/?feed=rss2&#038;p=286" rel="self" type="application/rss+xml" />
	<link>http://blog.ferpasherpa.org/?p=286</link>
	<description>Parent Perspectives on Privacy</description>
	<lastBuildDate>Thu, 21 Jan 2016 22:17:49 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.2.2</generator>
	<item>
		<title>By: Kris Alman</title>
		<link>http://blog.ferpasherpa.org/?p=286#comment-237144</link>
		<dc:creator><![CDATA[Kris Alman]]></dc:creator>
		<pubDate>Thu, 21 Jan 2016 22:17:49 +0000</pubDate>
		<guid isPermaLink="false">http://blog.ferpasherpa.org/?p=286#comment-237144</guid>
		<description><![CDATA[Unless there is a Roe v. Wade challenge to data ownership, &quot;parental choice&quot; is an illusion. 

The ACLU recently concluded that school personnel fail to protect student privacy. http://www.educationnews.org/technology/aclu-says-massachusetts-schools-have-weak-data-privacy-policies/

We unknowingly produce data when consuming it on the Internet. Information inequality manifests as unknown unknowns (credit to Donald Rumsfeld). The business model for digital consumption puts data custodians in charge of data.

Because of FERPA deregulations, data custodians are much more than glorified janitors. As &quot;school officials&quot;/&quot;authorized representatives,&quot; they own the keys to the front door and can &quot;free data&quot; in ways we cannot control. In this dystopian relationship, we don&#039;t even own a remote control to access our own data!

Data custodians are supposed to clean up platforms so they are secure. They are supposed to lock software backdoors so that malware cannot be introduced. 

And we’re just supposed to trust self-regulation and privacy pledges! That’s like asking kindergarteners to peer-grade for spelling errors. How do we know whether data custodians really play by their rules? 

Then there’s this little caveat in the privacy pledge:
Nothing in this pledge is intended to prohibit the use of student personal information for purposes of adaptive learning or customized education.

With artificial intelligence, a computer adaptively learns. Whether kids learn with algorithmic “customized” education (aka &quot;teaching machines&quot;/&quot;robot teachers&quot;) is another question. It may be that algorithmic education rewires the brain and dismantles the capacity to apply and synthesize knowledge—especially for young, developing minds.

Are we prepared for the unknown unknowns of algorithmic education? 

FERPA deregulations in 2008 and 2011 created a &quot;free data&quot; ecosystem with &quot;free market&quot; enterprise as the outcome. Where SOPIPA and its clones (including the Oregon Student Information Protection Act) fail is that they are unenforceable.

Consider the conditional language:
&quot;An operator shall (or may) not knowingly engage in any of the following activities...&quot; 

Switch that to: &quot;An operator shall (or may) unknowingly engage in any of the following activities…&quot;

Consider what I call the Google exemption clause: 
“This section does not apply to general audience Internet websites, general audience online services, general audience online applications or general audience mobile applications, even if login credentials created for an operator’s site, service or application may be used to access those general audience sites, services or applications.”

Do we just trust software app developers and companies like Google? In an FTC complaint, the Electronic Frontier Foundation maintains that Google violates its privacy pledge. SIIA says this is just “misunderstandings.” http://blog.siia.net/index.php/2015/12/some-misunderstandings-of-the-student-privacy-pledge/

The stakes are too high when EdTech data miners stake a claim on minors’ personal data. We have secrets. Are we entitled to a right to privacy? Should we © our identity to keep it private?]]></description>
		<content:encoded><![CDATA[<p>Unless there is a Roe v. Wade challenge to data ownership, &#8220;parental choice&#8221; is an illusion. </p>
<p>The ACLU recently concluded that school personnel fail to protect student privacy. <a href="http://www.educationnews.org/technology/aclu-says-massachusetts-schools-have-weak-data-privacy-policies/" rel="nofollow">http://www.educationnews.org/technology/aclu-says-massachusetts-schools-have-weak-data-privacy-policies/</a></p>
<p>We unknowingly produce data when consuming it on the Internet. Information inequality manifests as unknown unknowns (credit to Donald Rumsfeld). The business model for digital consumption puts data custodians in charge of data.</p>
<p>Because of FERPA deregulations, data custodians are much more than glorified janitors. As &#8220;school officials&#8221;/&#8220;authorized representatives,&#8221; they own the keys to the front door and can &#8220;free data&#8221; in ways we cannot control. In this dystopian relationship, we don&#8217;t even own a remote control to access our own data!</p>
<p>Data custodians are supposed to clean up platforms so they are secure. They are supposed to lock software backdoors so that malware cannot be introduced. </p>
<p>And we’re just supposed to trust self-regulation and privacy pledges! That’s like asking kindergarteners to peer-grade for spelling errors. How do we know whether data custodians really play by their rules? </p>
<p>Then there’s this little caveat in the privacy pledge:<br />
Nothing in this pledge is intended to prohibit the use of student personal information for purposes of adaptive learning or customized education.</p>
<p>With artificial intelligence, a computer adaptively learns. Whether kids learn with algorithmic “customized” education (aka &#8220;teaching machines&#8221;/&#8220;robot teachers&#8221;) is another question. It may be that algorithmic education rewires the brain and dismantles the capacity to apply and synthesize knowledge—especially for young, developing minds.</p>
<p>Are we prepared for the unknown unknowns of algorithmic education? </p>
<p>FERPA deregulations in 2008 and 2011 created a &#8220;free data&#8221; ecosystem with &#8220;free market&#8221; enterprise as the outcome. Where SOPIPA and its clones (including the Oregon Student Information Protection Act) fail is that they are unenforceable.</p>
<p>Consider the conditional language:<br />
&#8220;An operator shall (or may) not knowingly engage in any of the following activities&#8230;&#8221; </p>
<p>Switch that to: &#8220;An operator shall (or may) unknowingly engage in any of the following activities…&#8221;</p>
<p>Consider what I call the Google exemption clause:<br />
“This section does not apply to general audience Internet websites, general audience online services, general audience online applications or general audience mobile applications, even if login credentials created for an operator’s site, service or application may be used to access those general audience sites, services or applications.”</p>
<p>Do we just trust software app developers and companies like Google? In an FTC complaint, the Electronic Frontier Foundation maintains that Google violates its privacy pledge. SIIA says this is just “misunderstandings.” <a href="http://blog.siia.net/index.php/2015/12/some-misunderstandings-of-the-student-privacy-pledge/" rel="nofollow">http://blog.siia.net/index.php/2015/12/some-misunderstandings-of-the-student-privacy-pledge/</a></p>
<p>The stakes are too high when EdTech data miners stake a claim on minors’ personal data. We have secrets. Are we entitled to a right to privacy? Should we © our identity to keep it private?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Christopher Ball</title>
		<link>http://blog.ferpasherpa.org/?p=286#comment-237142</link>
		<dc:creator><![CDATA[Christopher Ball]]></dc:creator>
		<pubDate>Wed, 20 Jan 2016 19:54:33 +0000</pubDate>
		<guid isPermaLink="false">http://blog.ferpasherpa.org/?p=286#comment-237142</guid>
		<description><![CDATA[SOPIPA restricts the disclosure by the operator of data that it gathers from others. The operator could disclose to a tutoring service, provided the tutoring service does not re-disclose outside the enumerated provisions (see b(4)A of the law), if that tutoring furthers the K-12 purpose of the operator. It cannot, however, sell the info to the tutoring service. 

SOPIPA neither forbids nor entitles a parent to access the operator&#039;s output of the data or to order that it be sent to the tutoring service.]]></description>
		<content:encoded><![CDATA[<p>SOPIPA restricts the disclosure by the operator of data that it gathers from others. The operator could disclose to a tutoring service, provided the tutoring service does not re-disclose outside the enumerated provisions (see b(4)A of the law), if that tutoring furthers the K-12 purpose of the operator. It cannot, however, sell the info to the tutoring service. </p>
<p>SOPIPA neither forbids nor entitles a parent to access the operator&#8217;s output of the data or to order that it be sent to the tutoring service.</p>
]]></content:encoded>
	</item>
</channel>
</rss>
