A Service I Want

I would like an algorithm or service that would suggest arguments, opinions, and points of view from smart people trusted within their communities but with whom I am likely to disagree or whose communities I am underexposed to. I do not think I am alone in this desire.

I attempt to get some of this out of who I follow on Twitter (and it was a great use for Google Reader -- may it rest in peace), but that is a pretty imperfect system. I also routinely ask others to suggest sources I might like to fulfill these needs, but I have found that many struggle to make good suggestions.

Noble Returns to the Pavilion, from "W.G.", cricketing
reminiscences and personal recollections
Public domain book from the Internet Archive.

One of the tricky things about this algorithm or service is that it would need to distinguish between the arguments and communities I care about, those I do not, and those I am repulsed by. For example, I am probably underexposed to cricket enthusiasts, but I don’t care much about cricket anymore and don’t want more information about it. Another example: I have not read any of the conspiracy theories claiming the Parkland victims were actors, but I would be actively repulsed if a service suggested I read about them.

My suspicion is that one of the reasons services serve up filter-bubble content based on the engagement metrics of friend groups and similar users is that it is much easier than finding good, challenging material to suggest to users. That said, I wonder if the latter might be more fulfilling to the user over the long term, and might result in a stickier service, if it could be achieved.

Do you know of a service doing a good job of this? Do you have ideas for users or publications that would fit this bill for me? If so, please send them my way at @amac.

C.L. Townsend, Playing Forward, from "W.G.", cricketing
reminiscences and personal recollections
Public domain book from the Internet Archive.

Internet & Jurisdiction

I went to the last Internet & Jurisdiction gathering in Paris. I can’t make it to the one that starts today in Ottawa, but I would have come if I could. I’ve been thinking about the last one all year because it was full of good, smart people trying to make progress on coherent and practical Internet jurisdiction. What I also loved about it was that I came away strongly disagreeing with the direction they were going. More on that below, but first some background.

Dwarf Galaxy Caught Ramming Into a Large Spiral Galaxy
(NASA, Chandra, 08/14/13) from NASA Marshall Space Flight Center.
This and the other space images accompanying this blog post appear
to be in the public domain, in spite of NASA's weird licensey language
to the contrary.
Background: Internet Jurisdiction

Jurisdiction is one of the oldest and thorniest questions for Internet policy: “Which government(s) get to regulate what and who, where?”

As John Perry Barlow put it in his 1996 manifesto declaring the Internet’s independence from government regulation, “[Cyberspace] is a world that is both everywhere and nowhere, but it is not where bodies live.” In that piece, he argued that regulation of the Internet by governments was both unwise and impractical. Others saw it differently. As Tim Wu wrote in 1997,  “it is possible to regulate the Internet, and ... countries, corporations, organizations, and private individuals are already doing so.” The first important legal cases involving the extent of government jurisdiction over the Internet were decided shortly thereafter.

As the Internet has grown, become more mainstream, and increased in importance, particularly with respect to real world consequences that governments have historically regulated, questions of which governments get to regulate who and what online have become increasingly frequent. These questions get “answered” in courts, as governments make laws, and by corporations and individuals as the architecture, norms, markets, and regulation of the Internet develop.

There has been no straight line of consensus “progress” from one point of view to another. Even now, there are big questions being actively fought, including the United States Supreme Court considering Microsoft’s challenge to a request by the United States for user data stored in Ireland, and the Supreme Court of Canada asserting the ability to order content removed globally, only to have a U.S. District Court disagree.

The technological landscape has also changed dramatically. Over the last twenty years, as billions of people started using the Internet, it has morphed from an incredibly decentralized landscape of personal websites hosted by tiny service providers, often at the very edge of the network, to a more centralized set of cloud-storage service providers serving a large percentage of the population. If the FBI wanted to find out whether I had sent an email to a particular person in 1996, they would have had to come to my house to get my computer and take a look at my locally stored email, if I hadn’t already deleted it. Today, all my email is on Google’s servers, just like that of more than a billion other people from all over the world. The public content I created used to be housed on a server in my closet. If someone thought I was saying something illegal, they would likewise most likely have had to come to me to get it removed from the Internet. While it is true that, in some situations, other avenues existed to get my information or remove my content, they were not very broadly available or used. By contrast, now, most of my online content is served by large U.S. corporations, like Google and GitHub. If they decide my content shouldn’t be online, they can remove it and force me to go look for another publisher. In some cases the online service providers are so important that no suitable replacement exists.

Tarantula Nebula (NASA, Chandra, Hubble, Spitzer,
04/17/12) from NASA Marshall Space Flight Center.
Towards a coherent, if abhorrent, Internet jurisdiction policy

The Internet & Jurisdiction Conference (I&J for short) focuses on three broad tracks: data (requests for private user data); content (requests to render content inaccessible); and the Internet domain name system. I’m most interested in the first two, and these comments are mostly about them.

I&J had a wide variety of participants and many more government and law enforcement types than I generally find at Internet policy conferences. The conversation was therefore more oriented towards those stakeholders than at some other conferences, and it was quite similar in tone to the types of conversations happening in governments and courts all over the world right now.

In both the data request and content removal areas, these conversations are moving towards a coherent, if abhorrent, policy of allowing governments almost everywhere to get data about any Internet user, or to remove any content, without needing to engage the users themselves or the court systems of their jurisdictions. Most discussions exclude certain governments from the club that should have this type of power, but the idea that data can be handed over and content suppressed through interactions between governments and repeat-player intermediaries was so ingrained in many of the discussions as to be an assumption. Convenience and speed are touted as the principal advantages.

For example, a Facebook user in Mexico should have their data given to authorities in England on a request to Facebook. A Canadian Microsoft user should have their post suppressed, at least in Thailand if not all over the world, via a request to Microsoft. Even if the user is known to the complainant, no direct approach to them is contemplated. At some companies, under some circumstances, the user might get a notice, but that is left to the companies and the circumstances. This is not just “All your base are belong to us” but “All your base aren’t ever belonged to you in the first place.”
Black Hole Caught in a Stellar Homicide (NASA, Chandra,
GALEX, 05/03/12)
 from NASA Marshall Space Flight Center.
A challenge: Center Internet policy on users and citizens

If I were able to come again this year, I would. Indeed, my favorite conferences are those at which a majority of the attendees are smart, passionate advocates with direct experience with the subject matter and with whom I disagree (see e.g. the excellent Fordham Intellectual Property Institute). If I were there, I would challenge the attendees to propose a way forward that centers on the user rather than removing them from the equation. It is more convenient to just go to the big corporate repeat players. They are well known to the governments and can be counted on to pick up the phone. However, an Internet jurisdiction policy that regularly circumvents the user will encourage countermeasures to return power to the user -- the emerging prevalence of end-to-end encrypted services is one good example of this trend. More importantly, those users are our countries’ citizens; they deserve our respect and, at least, the chance to face their accusers and challenge the accusations. There may be cases where expediency trumps all, but those are a tiny minority of cases, not the norm on which policy should be based.

The YouTube clip above is of me trying to make a similar point at the end of last year's I&J.

PS If you want to read more about the current intermediary liability battles, please follow Daphne Keller and Eric Goldman and take a look at their excellent sets of resources on the topic at Stanford and on Eric’s blog. Graham Smith also wrote a couple of posts in the run-up to this year’s I&J.

Star Cluster Cygnus OB2 (NASA, Chandra,
11/07/12) from NASA Marshall Space Flight Center.

Screens, Images, and Attention (a thing I'm working on)

Many of us have more than a few screens. Mostly they lie idle: a tablet for plane trips, an old phone, a Chromecast-connected LCD TV.

I also have a bunch of images I want to see more of. More than 100,000 personal digital photos. And, lots more incredible images out in the world. Millions.

One of the things I loved about the Obama White House was that Pete Souza and his team's wonderful photos were everywhere and changed relatively frequently. Walking to a meeting you'd see a picture of a co-worker and her kids hugging the President. In a meeting room there would be a beautiful photo of Half Dome in Yosemite and Marine One. It was great.

President Barack Obama visits with Natalie Quillian,
Advisor to the Chief of Staff, and family in the Oval Office,
 Aug. 27, 2014. (Official White House Photo by Pete Souza).
Public Domain.
I'm working on using idle screens to bring more of that into my home with my images and those from incredible online collections. The idea would be that the screens could show a playlist (automatically generated or hand-curated) of images from a wide variety of sources and on a wide variety of devices / screens. Incidentally, I haven't been able to find the equivalent of .m3u for images. I don't think it exists. It would be wonderful if people could trade image playlists.
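Since nothing like .m3u seems to exist for images, here is a minimal sketch of what a tradable image playlist could look like. There is no standard to follow, so the format and every field name below are my own invention:

```python
import json

# A hypothetical image-playlist format: a JSON list of entries, each with a
# source URL and optional display metadata. All field names are invented.
SAMPLE_PLAYLIST = """
[
  {"url": "https://example.org/halfdome.jpg", "title": "Half Dome", "seconds": 30},
  {"url": "https://example.org/marine-one.jpg", "title": "Marine One", "seconds": 30}
]
"""

def load_playlist(text):
    """Parse playlist JSON into (url, title, seconds) tuples in display order."""
    entries = json.loads(text)
    return [(e["url"], e.get("title", ""), e.get("seconds", 15)) for e in entries]

for url, title, seconds in load_playlist(SAMPLE_PLAYLIST):
    print(f"show {title} ({url}) for {seconds}s")
```

Anything that can parse JSON could exchange playlists like this, which is roughly all .m3u does for audio.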

Automatically generated playlists could show images relevant to the day of the year, or types of images, or ones that will look good on that particular screen, or ones I might like based on what I've liked before. An earlier version of this was surprisingly good if it just showed images from Christmas, New Year's and Thanksgiving.
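The day-of-the-year generator is simple to sketch. This is a toy version with made-up photo records, not the code from my project:

```python
from datetime import date

# Hypothetical photo records (taken_on, path) -- not a real library format.
PHOTOS = [
    (date(2015, 12, 25), "xmas_2015.jpg"),
    (date(2016, 12, 24), "xmas_eve_2016.jpg"),
    (date(2016, 7, 4), "fireworks.jpg"),
]

def on_this_day(photos, today, window_days=3):
    """Return paths of photos taken within window_days of today's month/day."""
    def day_of_year(d):
        # Project onto a fixed leap year so Feb 29 photos don't crash.
        return date(2000, d.month, d.day).timetuple().tm_yday

    target = day_of_year(today)
    picks = []
    for taken, path in photos:
        delta = abs(day_of_year(taken) - target)
        if min(delta, 366 - delta) <= window_days:  # wrap around New Year
            picks.append(path)
    return picks

print(on_this_day(PHOTOS, date(2018, 12, 25)))
```

On December 25 this surfaces the Christmas and Christmas Eve shots and skips the fireworks, which matches how well the holiday version worked.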

Solar Flare, August 31, 2012, Nasa Goddard Space Flight
Center. Hosted by Wikipedia. Public Domain.
I don't know of a thing out there that does this across image collections. Do you?

I mostly program to learn things. This project will help me really learn the techniques I've been studying in Andrew Ng's Machine Learning course on Coursera. It will also brush up some of my full-stack web development skills: the database will be MySQL, the backend Python 3 and Django 2, and the frontend HTML and CSS with a lot of JavaScript (including Ajax, which I haven't used much). And I'll get to know both jQuery and Cycle 2 well. It has been fun so far, and I will share updates as I go and, eventually, the code.

I'm not doing this as an "entrepreneur." The idea isn't to make a bunch of money or get a million users. Mostly I'm doing this for myself and to learn. I have found that deeply understanding technology is really helpful to my legal and policy work (not to mention my life). I also really enjoy coding for fun. If you haven't tried that, you should!

I'm writing about it here to further commit to finishing it and so that others can share their good ideas.

153rd New York Infantry, ca. 1861, from Metropolitan Museum
of Art. Hosted by the Internet Archive. Public Domain.
If you are interested in using or working on this, please let me know. I'd be interested in understanding other use cases. I would also love pointers to great image repositories (preferably public domain). And eventually, I'd like people to help me rate the pics from some of the public repositories. My email is "lawyer" @ the popular email service run by Google. Or you can @reply me on twitter @amac.

Recap & Response to a Thread on Speech

Sometimes a Twitter thread is easier to read as a blog post.

The below was originally posted on Twitter.

1) Good thread by @yonatanzunger with a bunch of useful truths. Recap & comments from me below.

2) Speech can be used as a weapon against other speech: https://twitter.com/yonatanzunger/status/914609013722984448
See also @superwuster arguing that the 1st Am is obsolete in an era of attention scarcity.

Fight between Rioters and Militia, from Pen and Pencil Sketches of the Great Riots. Image in the Public Domain.

3) People bear diff costs of bad speech & harassment, disadvantaged often most affected:

4) Understanding & combating speech that reduces engagement can further a speech maximizing policy goal:

5) Having + stating an “editorial voice,” gestures, public perception & examples also can be important:

The Frame, from Typographia. Image in the Public Domain.

6) Also, he gives great pointers to smart folks in the online community field:
And of course there are many more, incl: Heather Champ, @juniperdowns, Victoria Grand, Monika Bickert, Shantal Rands, Micah Schaffer, @delbius, @nicolewong, @zeynep, @zephoria, @StephenBalkam, @unburntwitch, @noUpside, @EthanZ,  @jessamyn, @sarahjeong + many many more incl great non-US folk. And including the folks & orgs on the various advisory councils:
https://www.facebook.com/help/222332597793306/ (and others)
As @yonatanzunger says, this work is a team sport that advances with help from all around.

7) I have some Qs re his 47 USC §230 (CDA) points. I don't know a case of something like his “editorial voice” breaking immunity or otherwise causing a “huge legal risk.” Indeed that was the point of §230 originally. So, asking experts: @ericgoldman & @daphnehk what do you think?

8) Also, I don’t think “maximizing speech” is quite the right goal or that every service should have the same goal. I want something different when I go to Facebook v Twitter v YouTube.
Also, I want more than one good service whose arch + policies (and, sure, “editorial voice”) support an extremely wide diversity of views being able to flourish, be expressed well & be easy to find & interact with including from outside social circles. But your mileage may vary.

9) Naturally, I also disagree that Twitter folks (including me) “never took [these issues] seriously,” provided “bullshit” explanations, were naive, and chased traffic over good policy. Was there & think I'd know.
But, taking that sort of beating is kinda part of the job. And, maybe I’m too biased from working & learning these issues at platforms incl many at Google, Twitter & in govt w/ @POTUS44.

10) Anyhow, I’m very glad @yonatanzunger chose to post this thread to Twitter & I hope the suggestions part is read widely.

Printing Press, from Typographia. Image in the Public Domain.

Google Location History to Country Chart

I had some spare time, so I knocked out another rough-and-ready set of scripts that I've been meaning to code for a while (see also DenseDead). These scripts, written in Python with help from the Google Geocoding API, Google Charts, and the GeoPy module, will give you a map of the world with countries colored by how many years it has been since you visited them. Mine looks like this (I added some data by hand for before 2012):

It is kind of kludgy, but in case you are curious, the steps and the scripts are below.

One of the reasons I wanted to write these is that the Google information is awesome but both too detailed for what I'd like to keep lying around and not useful to me as raw latitudes/longitudes. Instead, I'd like to know the countries I've been to over time. With these scripts, I convert the timestamps and latitude/longitudes into timestamps and addresses before fuzzing them down to years and countries, which is immediately useful and about the level of detail I want to keep. If you no longer find Google's retention of the more fine-grained information useful, you can also clear the more detailed information from Google (instructions to delete your location history).
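The core transformation the scripts perform -- one (timestamp, latitude, longitude) record fuzzed down to (year, country) -- can be sketched like this. The reverse_geocode callable here is a stand-in; the real scripts use the Google Geocoding API via GeoPy for that step:

```python
from datetime import datetime, timezone

def fuzz_point(timestamp, lat, lng, reverse_geocode):
    """Reduce one (timestamp, lat, lng) record to the (year, country) level.

    reverse_geocode is any callable mapping (lat, lng) -> country name;
    the real scripts call the Google Geocoding API for this.
    """
    year = datetime.fromtimestamp(timestamp, tz=timezone.utc).year
    return year, reverse_geocode(lat, lng)

# Stub geocoder for illustration -- a real one would hit the API.
def fake_geocode(lat, lng):
    return "France" if 42 < lat < 51 and -5 < lng < 8 else "Unknown"

print(fuzz_point(1489000000, 48.85, 2.35, fake_geocode))  # (2017, 'France')
```

Keeping the geocoder swappable also means you can batch, cache, or rate-limit API calls without touching the fuzzing logic.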
  1. Go to Google Takeout and download a KML of your location history. Allow me one small digression here.
    Google Takeout exists because of a relatively small group of engineers and others at Google who worked hard to make it exist. The team used to go by the name of the Google Data Liberation Front and had a really cool website and logo. The website now redirects to a support article, but there are still great people at Google working to ensure that users have access to their data. This type of data portability is extremely important, and ensuring it is something I have worked on in both the private and public sectors. Thank you to the current and former team members and allies of the Data Liberation Front!
  2. Run reduce_kml.py to reduce the number of KML entries to a more manageable number. I threw out all entries that are within 10 miles of the last entry I counted. The command line is:
    #python reduce_kml.py Google_Location.kml > outfile.kml
  3. Split up the resulting file into chunks so that you don't violate the Google Geocoding API's daily limit. I used "#split -l 4000 output_from_reduce.kml infile", which will produce 2000 calls to the Google API per file.
  4. Run get_addresses.py on each of the split files. Do one per day so as not to get blocked. The command line is:
    #PYTHONIOENCODING=utf-8 python get_addresses.py input_file.kml
    Note that I needed to specify the encoding because although I think I understand encoding well enough, I don't. If someone else wants to teach me how to fix that, I would love to know.
  5. Concatenate the output files ("#cat infileaa infileab infileac > infile_join").
  6. Optional: Fuzz the joined outputfiles by using fuzz_addresses.py:
    #python fuzz_addresses.py inputfile.csv outputfile.csv
  7. Run make_country_chart.py to create an HTML file that will include the JavaScript for the chart. The command line is:
    #python make_country_chart.py input_file.csv > Country_Chart.html
You may download the scripts at amac0/google-location-tools and the most interesting are also below.
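The distance filter in step 2 boils down to a few lines. This is my own reconstruction, with an inline haversine formula instead of GeoPy's distance helpers so it runs standalone:

```python
import math

def miles_between(p1, p2):
    """Great-circle distance in miles between two (lat, lng) points (haversine)."""
    lat1, lng1, lat2, lng2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lng2 - lng1) / 2) ** 2)
    return 3959 * 2 * math.asin(math.sqrt(a))  # mean Earth radius ~3959 mi

def reduce_points(points, threshold_miles=10):
    """Keep a point only if it is at least threshold_miles from the last kept point."""
    kept = []
    for p in points:
        if not kept or miles_between(kept[-1], p) >= threshold_miles:
            kept.append(p)
    return kept

# Two nearby San Francisco points and one in Los Angeles:
track = [(37.77, -122.42), (37.78, -122.41), (34.05, -118.24)]
print(reduce_points(track))  # the second SF point is dropped
```

Comparing against the last *kept* point, rather than the last point seen, is what stops a slow drift of tiny hops from being discarded entirely.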


First Time in Government

“The President of the United States is going to call you in three hours to offer you the job, so I need to know in two whether you will say yes because we do not surprise the President.” That’s what Todd Park, U.S. Chief Technology Officer (CTO), said to me in August of 2014 as our family was about to head back to San Francisco for the new school year.
Todd Park, Assistant to the President and Chief Technology Officer
shows President Obama information on a tablet April 15, 2013.
Official White House Photo by Pete Souza.
In those two hours, I tried to figure out if I could really make a significant positive impact in the job the President would offer me as Deputy U.S. CTO. And, if so, whether that was worth moving our family.  The ability to make a positive impact is generally my north star when trying to make job decisions, but time and again, when I am looking back on whether taking a job was the right decision, the quality of the team I got to work with is always most important. Now that I’ve had some time to reflect on my time in government, and the entire Obama team has moved on from team CTO, I know that this time was no different. While I am extremely grateful for the impact of the work I was privileged to do, I am most happy about my time in government because of the people I got to do it with.

The Eisenhower Executive Office Building, home of the
U.S. CTO, pictured c. 1907, when it was the State, War & Navy Building.
Gall, George, Washington: The Capital of the Nation (1907).
Digitized by The Internet Archive from the Library of Congress

The impact of government work was amazing. Our purpose was clear: help make life better for and with the American people. Under President Obama, Team CTO had significant impact working together along with many others in our home at the Office of Science and Technology Policy (OSTP), elsewhere in the White House, and across the Federal Government. We supported the work done by previous CTO teams to bring tech capacity to government in the form of the Presidential Innovation Fellows, 18F, U.S. Digital Service, and revamped Office of Digital Strategy. We brought more data science and data scientists into government through the creation of the U.S. Chief Data Scientist team and role, and creating a data science cabinet. We expanded data collaborations for solutions in justice, jobs, housing, education, and more, while continuing to get more government data out to the public. The Open Government Partnership (OGP) continued to thrive and grow and we shepherded the U.S. Open Government National Action Plans while teaming up with the National Security Council and State Department to help lead the U.S. OGP. With support from the Chief of Staff, we created a new tool for tech policy making called the Tech Policy Task Force -- which added “TQ” to many policy tables, formulated a federal source code policy, moved the government forward on artificial intelligence and uncrewed aerial vehicles, accelerated open educational resources, highlighted the opportunities and challenges of big data and algorithmic decision making, and worked with the Departments of Education, Transportation, Commerce, State, Homeland Security, Justice, and others to help regulations get out of the way of innovation while protecting people’s rights and lives. 
We pushed for greater recognition of all American talent, including women and underrepresented minorities in STEM; catalyzing improvements in the portrayal of STEM people in media; expanding inclusive opportunity in computer science education; ensuring outreach for jobs in innovative industries in hiring programs, such as TechHire; increasing Internet connectivity in the U.S. and around the world; championing innovative local community solutions; pushing for more diversity and inclusion in the Federal government workforce; and expanding best practices in diversity, equity, and inclusion in organizations and companies, including an implementation action grid, the Tech Inclusion Pledge, and expanded inclusive venture funding. And, lots more. Even reading the list brings a tired smile to my face.

President Obama writes his first line of code and celebrates
with the middle school student who helped teach him, Dec. 8 2014.
Official White House Photo by Pete Souza
So, while there was no shortage of impact, I still would say that the people were the most important reason why I was so glad I had the opportunity to work in government. The diversity of people in the White House and at agencies was a huge difference from Silicon Valley. That diversity was expressed in terms of the traditional lines of socio-economic status, ethnicity, race, color, religion, age, disability status, gender identity, sexual orientation, and far more balanced gender representation, but also in terms of point of view, education, geographic origin, and career background. Sometimes, I was among others like me, but more frequently I was unusual along a number of dimensions, such as my lack of long federal service, my lack of military service, my tech background, etc. I came away extremely impressed with the level of experience, intellect, and passion that the Obama White House was able to attract. There were people you later discovered were Rhodes Scholars, or Supreme Court clerks, or had beaten a Scrabble world champion over the weekend. I had expected both the diversity and the excellence, but it is one thing to expect something and quite another to live it for two-plus years.

President Obama talks with U.S. CTO Megan Smith, and
OSTP Director Dr. John Holdren, Oct. 8, 2014.
Official White House Photo by Pete Souza
Team CTO under U.S. CTO Megan Smith was also outstanding. I had the privilege of working with a number of people I have admired and wanted to work with for years. I also got to meet and work with a bunch of folks I might never have otherwise met. Over my time there, the U.S. CTO team included, at different times: Puneet Ahira, Seth Andrew, Rob Bacchus, Jake Brewer, Marvin Carr, Jimmy Catania, Colleen Chien, Evan Cooke, R. David Edelman, Ed Felten, Anjali Fernandes, Brian Forde, Brianna Fugate, Dipayan Ghosh, Vivian Graubard, Renee Gregory, Dan Hammer, Natalie Evans Harris, Read Holman, Kristen Honey, Mina Hsiang, Kelly Jin, Terah Lyons, Matthew McAllister, Dawn Mielke, Lynn Overmann, Ryan Panchadsaram, DJ Patil, Tom Power, Laura Weidman Powers, Jason Schultz, Nick Sinai, Lauren Smith, Ashkan Soltani, Suhas Subramanyam, Emily Tavoulareas, Maya Uppaluru, Aden Van Noppen, Nancy Weiss, Claudia Williams, Charles Worthington, Cori Zarek, and, of course, the wonderful Megan Smith herself.

Some of the wonderful folks that made up team CTO,
Jan. 14, 2017.

That combination of a top-notch team amid a diverse broader group of excellent folks from top to bottom at OSTP, the broader White House and across the Government, made going into work both a joy and a challenging learning experience every day. I felt like I grew a ton, learned a lot about how the U.S. government functions, and picked up some really interesting management and leadership lessons from the people I got to work with. I came home mentally worn out but almost always smiling. I also made a bunch of new friends.

President Obama talks with Girl Scout White House Science
Fair participants who had designed a Lego page turner to help
people read books who may not otherwise be able, Mar. 23 2015.
Official White House Photo by Pete Souza

Now that I have been out of government for a few months, I am thankful for the opportunity to make a positive impact, but I am certain that I made the right choice that August because of the amazing people I got to work with. I am still fired up that I had the privilege to serve with each and every one of them, and ready to go and work with them again!

President Obama listens during a technology
strategy discussion, Oct. 8, 2014.

Dense Dead

If there is some interest, I'll publish something longer on this. In the meantime, I threw together a quick script that takes a link to one of the wonderful Grateful Dead shows available to stream from the Internet Archive and edits out the least dense of Dead songs, Drums and Space, so that when I am listening to the Dead and working, I don't need to skip them.

The result is DenseDead.com and a chrome extension that will rewrite Grateful Dead Internet Archive pages to push the m3u files through Dense Dead.
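The heart of the trick is just playlist filtering. Here is a rough, standalone sketch; the real service fetches the Archive's .m3u over HTTP, and the URLs below are invented stand-ins in the Archive's general naming style:

```python
# Words that mark the tracks to drop; matching on the track URL is a
# simplification of however the real DenseDead service identifies them.
SKIP_WORDS = ("drums", "space")

def densify(m3u_text):
    """Return the .m3u playlist text minus any track URL naming a skipped song."""
    kept = []
    for line in m3u_text.splitlines():
        if line.startswith("http") and any(w in line.lower() for w in SKIP_WORDS):
            continue
        kept.append(line)
    return "\n".join(kept)

# Invented example playlist:
playlist = """#EXTM3U
http://archive.org/gd1995-07-09d2t04_Drums.mp3
http://archive.org/gd1995-07-09d2t05_Space.mp3
http://archive.org/gd1995-07-09d2t06_Unbroken_Chain.mp3"""

print(densify(playlist))
```

Because only track URLs are touched, any #EXT metadata lines pass through untouched, and the filtered text is still a valid .m3u a player can stream.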

For example, to hear a slightly denser version of The Grateful Dead's last show, the Internet Archive URL would be:
the streaming URL on that page is:
and the DenseDead URL would be:
If you visit the Internet Archive URL for the show with the Dense Dead chrome extension installed, clicking on the VBR Stream Playlist link will automagically give you the denser m3u file even though the page itself still shows the full show.

If you find shows where Dense Dead doesn't work, hit me up on Twitter @amac and I'll see if I can fix it. If you want to know more about how I did this, also shoot me an @reply over on Twitter and I'll consider writing more.

Public domain image of skull from An Illustrated System of Human Anatomy: Special, General and Microscopic,
Samuel George Morton (1849) scanned by Google Books from Oxford University Library

