Wednesday, March 17, 2010

The FCC and Google want faster Internet access

After a year of public hearings and input via the Internet, the FCC has released their national broadband plan.

The FCC wants to bring 100 Mb/s download and 50 Mb/s upload speed to homes and 1 Gb/s to schools, hospitals and government buildings. They also advocate converting wireless spectrum used for TV broadcast to Internet access, with the goal of giving the US the fastest and most extensive wireless access in the world. They hope competition will lead to relatively cheap Internet access.

To put this in context, relatively low cost 100/50 Mb/s access is available in a number of cities and nations already.
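To make those speeds concrete, here is a quick back-of-envelope calculation (a sketch, with illustrative file size and a typical-for-2010 DSL speed chosen by the editor, not figures from the plan):

```python
# Back-of-envelope: time to transfer a file at various link speeds.
# File sizes are in megabytes, link speeds in megabits per second
# (8 bits per byte), ignoring protocol overhead.

def download_seconds(file_mb, speed_mbps):
    """Seconds to transfer file_mb megabytes at speed_mbps megabits/second."""
    return file_mb * 8 / speed_mbps

# A 700 MB movie at a typical 2010 DSL speed, the FCC's 100 Mb/s goal,
# and Google's 1 Gb/s test network:
for label, mbps in [("3 Mb/s DSL", 3), ("100 Mb/s", 100), ("1 Gb/s", 1000)]:
    print(f"{label}: {download_seconds(700, mbps):.0f} seconds")
```

At 100 Mb/s the movie arrives in under a minute; at 1 Gb/s, in a few seconds.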

More context -- Google plans to roll out 1 Gb/s fiber to between 50,000 and 500,000 homes in a test network.

Google hopes their test network will put pressure on the FCC and the ISP industry to be more ambitious. Faster speeds mean a better Internet experience, which means more users and more Google ads. One hundred megabits per second sounds pretty good today, but it won't seem so fast in ten years.

They also hope to spur innovation. We have seen that researchers often develop applications for technology they expect to be available in the future. For example, Ivan Sutherland, shown here, built prototype image processing software with a graphical user interface in the early 1960s, using a very expensive computer. It was over twenty years before similar programs like MacDraw and AutoCAD became economically viable.

Google hopes that, like Sutherland's expensive computer, their gigabit per second network will be used for new applications. They also hope to develop advanced technology for building fast networks.

How does 100 megabits per second compare to your current home Internet connectivity?

What sorts of applications would a gigabit per second connection enable?

Congress hoped to bring about competition and low prices for Internet access with the 1996 Telecommunications Act. Did the act succeed in spurring competition and lowering prices?

Thursday, March 11, 2010

Three reasons the iPad will succeed. Whoops, make that two.

The Apple iPad was announced January 27. After years of hype, people were generally disappointed. The trade press carried many articles like this one listing ten missing features. It cannot play Flash movies, the aspect ratio is not 16:9, AT&T is the only carrier, the battery cannot be changed, the operating system cannot multitask, there is no camera or HDMI interface to a TV set, etc.

For me, the most important missing feature is a microphone with accompanying speech recognition software. I want to be able to input marginal notes, email addresses, etc. without typing on a glass keyboard.

In spite of all of this criticism, I expected the iPad to succeed for three reasons.

The original Macintosh had 128 KB of memory, only a floppy disk for storage, and a tiny, monochrome screen. Still, its operating system and simple applications for image and word processing had graphical user interfaces (GUIs). The earlier Apple Lisa and Xerox Star also had GUIs, but failed because they were too expensive. By the time the Mac was delivered, technology had improved, and Apple had engineered a minimal system that was just good enough to get people excited and succeed in the market. The timing was right.

As technology advanced, Apple upgraded the Mac, adding memory and a hard drive, followed by a larger screen, color, etc.

The same will happen with the iPad. Features will be added as technology improves. I am confident that Apple has a multi-year plan for iPad improvements, and it has the potential to become a significant device for consuming content -- games, books, periodicals and video of all sorts.

I was optimistic for a second reason. Apple understands that the device is only one part of a system that includes the application store, content deals, and synchronization with the desktop and Internet. Apple learned this lesson with their ill-fated Newton, the first pocket computer. The Newton failed because the hardware was not powerful enough, and, more important, because it did not synchronize with desktop machines.

I am confident that Steve Jobs and his colleagues are negotiating with TV, movie, book, newspaper, and magazine publishers for content deals as I type this.

The third reason I was confident was that Apple had legions of software developers who, because they had developed iPhone applications, were ready to go on the iPad. Their iPhone applications would run with little or no modification on the iPad, and their programmers were up to speed on Apple's software development kit (SDK).

Apple might have learned the importance of the developer community by watching Microsoft. Microsoft wooed independent software vendors (ISVs) from day one. Since the early days of MS-DOS, they invited ISVs to conferences, provided them with excellent tools, set up a developer's organization, etc. Apple has taken this a step further with their application store -- they also provide a distribution channel at a reasonable cost. They removed the "V" from ISV. Independent developers were just developers, not vendors.

That is the good news (for the iPad). The bad news is that Apple seems to be blowing off the developer community. To use Apple's SDK, a developer has to agree to draconian terms. For example, they can only sell through Apple.

At some point, Microsoft could have afforded to ignore the ISV community -- Windows had a monopoly -- it was the only game in town.

Unfortunately for Apple, their developers have alternatives -- Google's Android, Microsoft's Windows Phone 7, and Palm's webOS. Google seems to be the biggest threat. They just released a new version of their SDK, which is provided to developers without restriction, and they offer prizes for outstanding applications.

I am still betting on the success of the iPad, but the odds have dropped. Apple's high-handed attitude toward developers could be the chink in their armor.
