Glancing through the FCC proposal, I see that it does indeed look consumer-friendly, and so it is likely to be fought tooth and nail by the broadband providers. So I encourage you to write a supportive note to the FCC – either on the ACLU link above or directly with the FCC.
Archive for the ‘Policy’ Category
Euclid’s idea is to provide Google Analytics-style information on foot traffic in retail stores. They implement it using the Wi-Fi on smart phones. This is technologically trivial: if you leave the Wi-Fi on your phone turned on, it will periodically transmit Wi-Fi packets, for example ‘probe requests.’ Every packet transmitted by a device contains a unique identifier for that device, the MAC address. So by gathering this information from a Wi-Fi access point, Euclid can tell how often and for how long each device is in the vicinity. Presumably enough people have Wi-Fi on their phones by now to gather statistically representative data for analytics purposes.
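To make the mechanics concrete, here is a toy sketch of the kind of aggregation a system like this presumably performs: rolling probe-request sightings up into per-device visits and dwell times. The MAC addresses, timestamps and the 30-minute visit gap are all invented for illustration; this is not based on Euclid's actual implementation.

```python
from collections import defaultdict

# Each sighting is (mac_address, timestamp_seconds), as an access point
# might log them from probe requests. The MACs and times are made up.
sightings = [
    ("aa:bb:cc:00:11:22", 0),
    ("aa:bb:cc:00:11:22", 60),
    ("aa:bb:cc:00:11:22", 120),
    ("de:ad:be:ef:00:01", 30),
    ("aa:bb:cc:00:11:22", 90000),  # the same device, a day later
]

GAP = 30 * 60  # treat a 30-minute silence as the end of a visit

def visits_per_device(sightings, gap=GAP):
    """Group sightings into visits: mac -> list of (start, end) times."""
    by_mac = defaultdict(list)
    for mac, ts in sorted(sightings, key=lambda s: s[1]):
        by_mac[mac].append(ts)
    result = {}
    for mac, times in by_mac.items():
        visits = []
        start = prev = times[0]
        for ts in times[1:]:
            if ts - prev > gap:        # long silence: close this visit
                visits.append((start, prev))
                start = ts
            prev = ts
        visits.append((start, prev))
        result[mac] = visits
    return result
```

From the visit lists it is a one-liner to derive the analytics the post describes: visit frequency is the list length, and dwell time is `end - start` for each visit.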
The Euclid technology doesn’t require your opt-in, and it doesn’t need to be tied to Wi-Fi. The concept can trivially be extended to any phone (not just Wi-Fi equipped ones) by using cellular packets rather than Wi-Fi, and for people with no phone, face recognition with in-store cameras. For this kind of application even 90% accuracy on the face recognition would be useful.
Privacy is one of only four choices on the navigation menu of Euclid’s website. It gets this prominent treatment because the privacy issues raised by this technology are immense.
Gathering this kind of information for one store – anonymous traffic by time, duration of stay, repeat visits and so on – doesn’t seem too intrusive on individuals, but Euclid will be tempted to aggregate it across all the stores in the world, and to correlate its data with other data that stores already gather, like point of sale records.
Many technology sophisticates I talk with tell me that it is naive to expect any privacy whatsoever in the Internet age, and I guess this is another example. Euclid will effectively know where you are most of the time, but it won’t know much more than your cellular provider, or any of the app vendors to whom you have given location permission on your phone.
I will be moderating a panel on this topic at ITExpo East 2012 in Miami at 3:00pm on Thursday, February 2nd.
The pitch for the panel is:
The FCC has proposed a date of 2018 to sunset the Public Switched Telephone Network (PSTN) and move the nation to an all-IP network for voice services. This session will explore the emerging trends in the Telco Cloud with case studies. Learn how traditional telephone companies are adapting to compete, and about new opportunities for service providers, including leveraging cloud computing and Infrastructure as a Service (IaaS) systems being deployed on scalable commodity hardware to deliver voice and video services: IVR, IVVR, conferencing, Video on Demand and local CDNs.
In related news, a group of industry experts is collaborating on a plan for this transition. The draft can be found here. I volunteered as the editor for one of the chapters, so the current outline roughs out some of my opinions on this topic. This is a collaborative project, so please contact me if you can help to write it.
TV White Spaces Second MO&O: A Second Memorandum Opinion and Order that will create opportunities for investment and innovation in advanced Wi-Fi technologies and a variety of broadband services by finalizing provisions for unlicensed wireless devices to operate in unused parts of TV spectrum.
Early discussion of White Spaces proposed that client devices would be responsible for finding vacant spectrum to use. This “spectrum sensing” or “detect and avoid” technology was sufficiently controversial that a consensus grew to supplement it with a geolocation database, where the client devices determine their location using GPS or other technologies, then consult a coverage database showing which frequencies are available at that location.
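As a rough illustration of how a geolocation lookup could work, here is a toy database in Python. The transmitters, coordinates, channels and protection radii are all invented, and real white-spaces databases model protected contours around transmitters rather than the simple circles used here.

```python
import math

TV_CHANNELS = set(range(21, 52))  # a simplified UHF channel range

# (channel, lat, lon, protected_radius_km) -- made-up transmitters
TRANSMITTERS = [
    (27, 32.78, -96.80, 100.0),
    (33, 32.90, -96.55, 100.0),
    (41, 33.10, -97.00, 60.0),
]

def _distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def available_channels(lat, lon):
    """Channels a white-spaces device could use at (lat, lon):
    everything except channels whose transmitter protects this spot."""
    blocked = {
        ch for ch, tlat, tlon, radius in TRANSMITTERS
        if _distance_km(lat, lon, tlat, tlon) <= radius
    }
    return sorted(TV_CHANNELS - blocked)
```

The point of the design is that the hard radio problem (detecting a weak TV signal) is replaced by an easy software problem: the device only needs its own position and a query to the database.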
Among the Internet companies, this consensus now appears to have evolved toward eliminating the spectrum-sensing requirement for geolocation-enabled devices.
The Register says that this is because spectrum sensing doesn’t work.
The Associated Press predicts that the Order will go along with the Internet companies, and ditch the spectrum sensing requirement.
Some of the technology companies behind the white spaces are fighting a rearguard action, arguing that there are good reasons to retain spectrum sensing as an alternative to geolocation. The broadcasting industry (represented by the NAB and MSTV) wants to require both. It will be interesting to see if the FCC leaves any spectrum sensing provisions in the Order.
Google and Verizon came out with their joint statement on Net Neutrality on Monday. It is reasonable and idealistic in its general sentiments, but contains several of the loopholes Marvin Ammori warned us about. It was released in three parts: a document posted to Google Docs, a commentary posted to the Google Public Policy Blog, and an op-ed in the Washington Post. Eight paragraphs in the statement document map to seven numbered points in the blog. The first three numbered points map to the six principles of net neutrality enumerated by Julius Genachowski [jg1-6] almost a year ago. Here are the Google/Verizon points as numbered in the blog:
1. Open access to Content [jg1], Applications [jg2] and Services [jg3]; choice of devices [jg4].
2. Non-discrimination [jg5].
3. Transparency of network management practices [jg6].
4. FCC enforcement power.
5. Differentiated services.
6. Exclusion of Wireless Access from these principles (for now).
7. Universal Service Fund to include broadband access.
The non-discrimination paragraph is weakened by the kinds of words that are invitations to expensive litigation unless they are precisely defined in legislation. It doesn’t prohibit discrimination, it merely prohibits “undue” discrimination that would cause “meaningful harm.”
The managed (or differentiated) services paragraph is an example of what Ammori calls “an obvious potential end-run around the net neutrality rule.” I think that Google and Verizon would argue that their transparency provisions mean that ISPs can deliver things like FIOS video-on-demand over the same pipe as Internet service without breaching net neutrality, since the Internet service will commit to a measurable level of service. This is not how things work at the moment; ISPs make representations about the maximum delivered bandwidth, but for consumers they don’t specify a minimum below which the connection will not fall.
The examples the Google blog gives of “differentiated online services, in addition to the Internet access and video services (such as Verizon’s FIOS TV)” appear to have in common the need for high bandwidth and high QoS. This bodes extremely ill for the Internet. The evolution to date of Internet access service has been steadily increasing bandwidth and QoS. The implication of this paragraph is that these improvements will be skimmed off into proprietary services, leaving the bandwidth and QoS of the public Internet stagnant.
Many consider the exclusion of wireless egregious. I think that Google and Verizon would argue that there is nothing to stop wireless being added later. In any case, I am sympathetic to Verizon on this issue, since wireless is so bandwidth constrained relative to wireline that it seems necessary to ration it in some way.
The Network Management paragraph in the statement document permits “reasonable” network management practices. Fortunately the word “reasonable” is defined in detail in the statement document. Unfortunately the definition, while long, includes a clause which renders the rest of the definition redundant: “or otherwise to manage the daily operation of its network.” This clause appears to permit whatever the ISP wants.
I got an email from Credo this morning asking me to call Julius Genachowski to ask him to stand firm on net neutrality.
The nice man who answered told me that the best way to make my voice heard on this issue is to file a comment at the FCC website, referencing proceeding number 09-191.
So that my comment would be a little less ignorant, I carefully read an article on the Huffington Post by Marvin Ammori before filing it.
My opinion on this is that ISPs deserve to be fairly compensated for their service, but that they should not be permitted to double-charge for a consumer’s Internet access. If some service like video on demand requires prioritization or some other differential treatment, the ISP should only be allowed to charge the consumer for this, not the content provider. In other words, every bit traversing the subscriber’s access link should be treated equally by the ISP unless the consumer requests otherwise, and the ISP should not be permitted to take payments from third parties like content providers to preempt other traffic. If such discrimination is allowed, the ISP will be motivated to keep last-mile bandwidth scarce.
Internet access in the US is effectively a duopoly (cable or DSL) in each neighborhood. This absence of competition has caused the US to become a global laggard in consumer Internet bandwidth. With weak competition and ineffective regulation, a rational ISP will forego the expense of network upgrades.
ISPs like AT&T view the Internet as a collection of pipes connecting content providers to content consumers. This is the thinking behind Ed Whitacre’s famous comment, “to expect to use these pipes for free is nuts!” Ed was thinking that Google, Yahoo and Vonage are using his pipes to his subscribers for free. The “Internet community,” on the other hand, views the Internet as a collection of pipes connecting people to people. From this point of view, the consumer pays AT&T for access to the Internet, and Google, Yahoo and Vonage each pay their respective ISPs for access to the Internet. Nobody is getting anything for free. It makes no more sense for Google to pay AT&T for a subscriber’s Internet access than it would for an AT&T subscriber to pay Google’s connectivity providers for Google’s Internet access.
The FCC has launched a broadband speed and quality test presumably to gather information about the real state of broadband in the US. This is a great initiative and I encourage you to go and run the test.
I tried it myself, but it wouldn’t let me take the test when I put in an English postcode – that is where I happen to be this week. So I put in my Dallas zip code instead and it ran the test. I hope there is some check that compares the zip code to the geographical location of my IP address and discards results that don’t match, or the data will presumably be worthless.
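The check I’m hoping for could be as simple as comparing a coarse IP-geolocation result against the claimed zip code. This toy Python version uses an invented prefix table in place of a real geolocation database (a production system would use something like MaxMind’s data keyed on full IP ranges):

```python
# Invented mapping: IP prefix -> 3-digit zip prefix. "None" marks an
# address outside the US, like my English connection this week.
IP_TO_ZIP_PREFIX = {
    "76.184.": "752",  # a notional Dallas-area prefix (illustrative only)
    "81.2.":   None,   # a notional UK address: no US zip at all
}

def plausible(ip, claimed_zip):
    """True if the claimed zip code is consistent with the IP's location."""
    for prefix, zip3 in IP_TO_ZIP_PREFIX.items():
        if ip.startswith(prefix):
            return zip3 is not None and claimed_zip.startswith(zip3)
    return False  # unknown IP: treat the result as unverifiable
```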
[Update:] Lauren Weinstein mentions this issue, as well as several others…
It’s like buying an airplane ticket then getting charged extra to get on the plane.
The cellular companies want you to buy cellular service and then pay extra to get signal coverage. Gizmodo has a coolly reasoned analysis.
AT&T Wireless is doing the standard telco thing here, conflating pricing for different services. It is sweetening the monthly charge option for femtocells by offering unlimited calling. A more honest pricing scheme would be to provide femtocells free to anybody who has a coverage problem, and to offer the femtocell/unlimited calling option as a separate product. Come to think of it, this is probably how AT&T really plans for it to work: if a customer calls to cancel service because of poor coverage, I expect AT&T will offer a free femtocell as a retention incentive.
It is ironic that this issue is coming up at the same time as the wireless carriers are up in arms about the FCC’s new network neutrality initiative. Now that smartphones all have Wi-Fi, if the handsets were truly open we could use our home Wi-Fi signal to get data and voice services from alternative providers when we were at home. No need for femtocells. (T-Mobile@Home is a closed-network version of this.)
Presumably something like this is on the roadmap for Google Voice, which is one of the scenarios that causes the MNOs to fight network neutrality tooth and nail.
Google and the New America Foundation have been working together for some time on White Spaces. Now they have (with PlanetLab and some academic researchers) come up with an initiative to inject some hard facts into the network neutrality debate.
The idea is that if users can easily measure their network bandwidth and quality of service, they will be able to hold their ISPs to the claims in their advertisements and “plans.” As things stand, businesses buying data links from network providers normally have a Service Level Agreement (SLA) which specifies minimum performance characteristics for their connections. For consumers, things are different. ISPs do not issue SLAs to their consumer customers. When they advertise uplink and downlink speeds, these speeds are “typical” or “maximum,” but they don’t specify a minimum speed, and they don’t offer any guarantees of latency, jitter, packet loss or even integrity of the packet contents. For example, here’s an excerpt from the Verizon Online Terms of Service:
VERIZON DOES NOT WARRANT THAT THE SERVICE OR EQUIPMENT PROVIDED BY VERIZON WILL PERFORM AT A PARTICULAR SPEED, BANDWIDTH OR DATA THROUGHPUT RATE, OR WILL BE UNINTERRUPTED, ERROR-FREE, SECURE…
Businesses pay more than consumers for their bandwidth, and providing SLAs is one of the reasons. Consumers would probably not be willing to pay more for SLAs, but they can still legitimately expect to know what they are paying for. The Measurement Lab data will be able to confirm or disprove accusations that ISPs are intentionally impairing traffic of some types.
This is a complicated issue, because one man’s traffic blocking is another man’s network management, and what a consumer might consider acceptable use (like BitTorrent) may violate an ISP’s Acceptable Use Policy (Verizon: “…it is a violation of… this AUP to… generate excessive amounts of email or other Internet traffic;”). The arguments can go round in circles until terms like “excessive” and “unlimited” are defined numerically and measurements are made. So Measurement Lab is a great step forward in the Network Neutrality debate, and should be applauded by consumers and service providers alike.
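As a sketch of what “defined numerically” means here, the SLA-style figures can be computed from simple timestamped probes. The probe data below is invented, and the jitter formula is a common simplification (mean absolute difference of consecutive round-trip times) rather than any particular tool’s definition:

```python
# Each probe is (sequence_number, rtt_ms), with None marking a lost
# packet. These numbers are made up for illustration.
probes = [(1, 20.0), (2, 22.5), (3, None), (4, 21.0), (5, 35.0)]

def link_stats(probes):
    """Return (loss_fraction, mean_rtt_ms, jitter_ms) for a probe run."""
    rtts = [rtt for _, rtt in probes if rtt is not None]
    loss = 1 - len(rtts) / len(probes)          # fraction of probes lost
    mean_rtt = sum(rtts) / len(rtts)            # average round-trip time
    jitter = sum(                               # mean |delta| between
        abs(b - a) for a, b in zip(rtts, rtts[1:])  # consecutive RTTs
    ) / (len(rtts) - 1)
    return loss, mean_rtt, jitter
```

With numbers like these in hand, “excessive” stops being rhetoric: an ISP claim and a user complaint can both be checked against the same measurements, which is exactly what Measurement Lab aims to enable.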
This is incredible news. The FCC has done a wonderful thing, standing up to the broadcast TV lobby to benefit the people of America. What’s even better, four of the five commissioners are enthusiastically behind the decision:
It has the potential to improve wireless broadband connectivity and inspire an ever-widening array of new Internet based products and services for consumers. Consumers across the country will have access to devices and services that they may have only dreamed about before.
Some have called this new technology “Wi-Fi on steroids” and I hope they are right. Certainly, this new technology, taking advantage of the enhanced propagation characteristics of TV spectrum, should be of enormous benefit in solving the broadband deficit in many rural areas.
Today the Commission takes a critically important step towards managing the public’s spectrum to promote efficiency, and to encourage the development and availability of innovative devices and services.
While new broadband technologies are the most likely uses of these channels, the most exciting part about our action today is that we are creating the opportunity for an explosion of entrepreneurial brilliance. Our de-regulatory order will allow the market place to produce new devices and new applications that we can’t even imagine today.
The fifth commissioner, Deborah Taylor Tate, is only partly on board – she thinks some of this spectrum should be licensed, and she is concerned that not enough provision has been made for remediation in the event that interfering radios are deployed.
The FCC decision is a bold one – a more conservative positive decision would have been to approve a rural broadband access-only (802.22-style) use for now, but the commissioners went ahead and approved personal/portable use as well, which is what Google, Microsoft and numerous other computer and Internet industry companies have advocated.
The ruling imposed a geolocation requirement, which will vastly increase the market for GPS silicon. The trend in embedded GPS, though, is to include it on the same die as other radios (like Bluetooth or cellular baseband), so whoever makes the White Spaces radio chips will probably be putting GPS on the same die by the second product generation.
The digital TV transition will open up the White Spaces spectrum in February 2009, but I will be very surprised if any white spaces consumer products appear in the market before 2010.