Victoria Zigadlo: Manifesto

1. Universities should embrace digital technologies that seek to lower the cost of education, but steer away from those that attempt to micro-manage students. (http://www.nytimes.com/2013/04/09/technology/coursesmart-e-textbooks-track-students-progress-for-teachers.html?_r=1&)

A new program called CourseSmart now allows teachers to track whether students have done the reading for their classes. Major publishers in higher education have already been collecting data from millions of students who use their digital materials. But CourseSmart goes further by individually packaging information on all the students in a class for each professor — a bold effort that is already beginning to affect how teachers present material and how students respond to it, even as critics question how well it measures learning. The plan is to introduce the program broadly this fall. While digital education technologies like this can be helpful, I assert that tools like CourseSmart cross a line, particularly in the realm of higher education. Students, once they reach a higher level of education, should be treated like adults who take their education seriously, regardless of whether they actually do. Digital education technologies should seek to make learning more open and affordable, not restrictive and distrusting of those who seek an education.

2. Parents should provide their children with access to technologies from an early age to encourage learning and innovation, but only allow unsupervised access to technologies when the child reaches their teens.

Technological toys provide children with a different learning model than traditional toys. Sherry Turkle describes this phenomenon in her observations of children who play with Furby dolls as opposed to traditional inanimate dolls. “Children say that traditional dolls can be ‘hard work,’” Turkle says, “because you have to do all the work of giving them ideas; Furbies are hard work for the opposite reason” (Turkle 39). Allowing children access to these toys will better equip them with multi-faceted skills for an ever-changing technological culture. However, to curb the negative side effects these toys can carry (stunted social development, a blurred line between something that is alive and something that is ‘alive enough’), they should be used only under supervision until the child is old enough to make independent innovations with their chosen technologies.

3. Governments should enforce a policy that all publicly traded technological corporations must post and maintain a company constitution that outlines the rules governing their conduct and the conduct of their users.

David Weinberger asserts that, with regard to the Internet at least, “our task is to learn how to build smart rooms—that is, how to build networks that make us smarter, especially since, when done badly, networks can make us distressingly stupider” (Weinberger Kindle location 139). In order to ensure that we can properly judge the value of our networks, however, it is important to promote as much transparency as possible between technological and social businesses and their consumers. Since these businesses currently wield a great deal of power and face fairly little regulation, I suggest we move toward a system that holds them more accountable to their users. This is not to say there has to be a great deal of government regulation and intervention — that is, after all, usually a sure way to suspend progress. However, allowing these companies to self-regulate, so long as they provide documentation of their customer policies — documentation that then becomes legally binding — would ensure that people can enter into dependent relationships with these businesses with a great degree of knowledge and assurance that their rights will be upheld.

4. Standards of self-regulation and fact-checking online content should be taught to school-aged children to promote more widespread, responsible Internet usage.

As Sherry Turkle states, “when media are always there, waiting to be wanted, people lose a sense of choosing to communicate” (Turkle 163). Similarly, David Weinberger explains that “the real limitation isn’t the capacity of our individual brains but that of the media we have used to get past our brains’ limitations” (Weinberger 5). Promoting the value of self-regulation in both media consumption and media creation fosters a consumer base of people who constantly question the value of the technological media they consume on a daily basis. This can be paired in school curricula with methods of online fact-checking. Together with self-regulation, fact-checking becomes extremely important to foster from an early age because it takes the edge off of the constantly updated online media cycle. If consumers are naturally tuned to double-check sources and fact-check the information they consume and the people they trust online, widespread or casual misuse and abuse of these online media channels becomes less likely.

5. Technologies should be given a trial period of at least two years before they are subject to widespread regulation.

The problem in our exponentially expanding technological sphere, Weinberger explains, is “not information overload. It’s filter failure” (Weinberger 28). When we fail to regulate technological advances properly, one of two things happens: technologies progress far beyond our control, or we prematurely hinder technologies that could be beneficial in ways we cannot possibly anticipate. How do we strike a happy medium between the two? I propose setting a trial period for new technologies to do just that. A technology could be in popular production for two years before it is subject to formal regulation. This accomplishes two things. First, it ensures that every technology has the opportunity to be reviewed before being given a free pass to keep expanding, providing a system for preemptive action that is not currently in place. Second, it gives technologies the appropriate amount of time for real-world application and testing before those who fear their potential put them to bed; it gets technologies far enough out of their experimental stages to be tested without causing lasting or widespread harm.

6. Lack of participation in the shaping of online policy should not equate to assumed consent.
(http://www.slate.com/blogs/future_tense/2012/12/10/facebook_site_governance_vote_with_low_turnout_privacy_policy_changes_pass.html)

“That's right: It really doesn't matter to Facebook at all that nearly nine in 10 people who voted on the changes opposed them. What matters is that a far greater proportion of the site's users didn't vote at all—a declaration of apathy and/or confusion that Facebook will interpret as a license to proceed with whatever changes it sees fit from now on.”

In seeking to build smarter networks of people who are already overwhelmed with information in our digital age, it is not enough to simply accept lack of participation as automatic consent. Businesses like Facebook should be held accountable to their users the same way a governing body would be: their rules should be clearly outlined, and a minimum participation threshold should be required before a change to those guidelines is enacted.