This was inadvertently posted before it was finished -- I will post the final version in a day or two.
Tuesday, January 19, 2016
We all need to be aware of the problem of bit rot in our work.
One of my favorite podcasts, OnTheMedia, produced a program called Digital Dark Age (52:14) last year. It consists of several segments on the problems of protecting and archiving the vast amounts of data we are generating. If you don't have time to listen to the entire podcast, at least check out two segments, interviews of Vint Cerf on information preservation problems (6:28) and Nick Goldman on using DNA as a storage medium (9:02).
Let's start with the Goldman interview. He notes that applications like video entertainment and scientific research are generating immense amounts of data, which is already overwhelming today's optical and magnetic storage media. Goldman is experimenting with DNA as a very high capacity, long lived data storage medium. How might that work?
A strand of DNA is made up of strings of the four bases abbreviated A, T, C and G, as shown here:
|The DNA strand is like a twisted ladder where the "rungs" are either A-T or C-G bonds. (For more, see this animation.)|
Biologists have developed equipment for sequencing (reading) the list of bases making up a strand of DNA and for synthesizing arbitrary strands of DNA. That makes it possible to store a copy of a binary file in a strand of synthesized DNA, for example by synthesizing strands of DNA in which binary 0s are represented by an A or C base and 1s are represented by a T or G. A DNA sequencer could then convert those A, T, C and Gs back into 1s and 0s.
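A minimal sketch of that encoding scheme in Python (the bit-to-base mapping is the one described above; real schemes also add error correction and avoid long runs of a single base):

```python
import random

# Map each bit to one of two possible bases, as described above:
# 0 -> A or C, 1 -> T or G. The choice between the two candidate
# bases is arbitrary here.
ZERO_BASES = "AC"
ONE_BASES = "TG"

def encode(bits):
    """Encode a string of '0'/'1' characters as a DNA base string."""
    return "".join(random.choice(ZERO_BASES if b == "0" else ONE_BASES)
                   for b in bits)

def decode(bases):
    """Recover the bit string from a sequenced strand."""
    return "".join("0" if base in ZERO_BASES else "1" for base in bases)

strand = encode("0110")
print(strand)          # e.g. "ATGC" -- one of several valid encodings
print(decode(strand))  # "0110"
```

Because each bit has two possible bases, many different strands decode to the same file, which is one way a real scheme can steer around chemically awkward sequences.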
Goldman thinks we will see expensive DNA storage devices in three or four years and that they will be cheap enough for consumer storage in 10-15 years. DNA stored in cool, dry places will last hundreds of thousands of years, and "all the digital information in the whole world, everything that's connected to the Internet" will fit in "the back of a minivan." If Goldman falters, there are other DNA storage projects at Harvard (article, video) and Microsoft.
But even if DNA or some other storage technology gives us dense, cheap storage, there are other problems, as outlined by Vint Cerf.
Cerf talks about "bit rot." The simplest type of bit rot is media deterioration -- media becoming unreadable after 20 or 30 years. That can be overcome by periodically creating a new copy, but that is not enough. Let me give you a personal example.
In 2004 my wife had a small hole in the atrial wall of her heart repaired. In a 30-minute outpatient procedure, a skilled surgeon installed a small device whose tiny umbrellas clamp over the hole, held in place by a spring.
|A balloon determines hole diameter (left), the device in place (right)|
That was pretty amazing, so I asked for a video of the procedure, which was provided on an optical CD. The CD also included a program for viewing the video, the Ecompass CD Viewer. The CD media might be somewhat deteriorated by now, but I have transferred the program and data to magnetic storage, so I can still view it on my laptop.
But, my laptop is running Windows 7. The version of Ecompass CD Viewer I have was written in the Windows XP days. It still works, but will it be compatible with Windows 10 or later? If not, I could upgrade to a current version, but, as far as Google Search knows, Ecompass CD Viewer is no longer with us -- perhaps the company went bankrupt or dropped the program.
Ecompass CD Viewer works by stringing together clips stored in a video file format called SSM. If I could find an SSM viewer, maybe I could cobble together the entire video, since I have the SSM clips. That might work for these static files, but we would need the functionality of the original program for anything a user interacts with, like a spreadsheet. Looking further into the future, Windows itself will disappear, along with Intel-processor machines.
Cerf does not see a complete solution to the bit rot problem, but he pointed to Project OLIVE at Carnegie Mellon University as a significant step in the right direction. OLIVE emulates old computers running old software on virtual machines. Below, you see an image from a demonstration of an emulated 1991 Macintosh running System 7 and HyperCard. The display, mouse and keyboard of the Dell PC are interacting with the simulated Macintosh, which is running on an Internet server.
|Still image from a Project OLIVE demo video.|
The OLIVE demonstration is impressive, but it is a research prototype that assumes standard input/output devices and capturing the vast number of programs and hardware configurations that exist today would require a massive effort. Similarly, DNA storage is at the early proof-of-concept stage. Both feel like longshots to me, so, for now, we all need to be aware of the problem of bit rot in our work.
Monday, December 21, 2015
Will Internet news go the way of newspapers?
December 17 was the anniversary of President Obama's call for change in our Cuba policy. That milestone sparked a lot of coverage in the mainstream press, which I discussed in a post on my blog on the Cuban Internet.
The most extensive Cuba coverage I saw was a week-long series of posts on Yahoo -- U. S. and Cuba, One Year Later. The series has many well written posts on various aspects of Cuban culture and the political situation. Most are human interest stories on tourism, fashion, baseball, etc., but several were Internet-related.
The posts are not detailed or technical, but they are well written for a general audience -- like newspaper readers. (Remember newspapers?)
That is the good news.
The bad news is that the posts are overrun by annoying ads and auto-play videos. This is illustrated by the series "home page," shown below.
|The series table of contents -- can you spot the ads?|
Most of the elements in this array of phone-sized "cards" consist of an image from and link to a story on Cuba, but, if you look carefully, you will see that several of them link to sneaky ads. It's like Where's Waldo -- can you spot the ads?
In an earlier post, I suggested that advertising-based, algorithm-driven Internet news might be increasingly redundant and concentrated in high-volume sites. The advertising revenue is used to pay talented writers, photographers and videographers who are capable of producing timely news coverage of a story like this one.
But, people are fed up with those ads and increasingly deploying ad blockers.
|Reasons people turn to ad blockers|
People are circulating manifestos and Google and Facebook are proposing standards to improve the advertising and speed of the Web, but will those moves cut Yahoo's revenue?
Furthermore, mainstream media like Yahoo rely to some extent on specialized, "long tail" sources, like my Cuban Internet blog. Yahoo interviewed me (and many others) and took material from some of my posts in preparing their coverage. But, will specialized blogs and sites continue to exist? Losing focused sources would increase the cost of mainstream media stories.
I really liked Yahoo's coverage of Cuba one year later, but I wonder if they will be around for Cuba five years later.
ASUS will include the Adblock Plus ad blocker in their proprietary browser. Apple is now allowing ad blockers on iOS devices. Will Firefox be next? Microsoft Edge?
Thursday, December 10, 2015
Will Google free me from the evil clutches of the dreaded Time Warner Cable?
Google's first foray into municipal networking was connecting 12 square miles of Mountain View, California in 2007. In 2010 they issued a call for proposals from cities wishing to participate in an "experiment" called Google Fiber, which would offer symmetric, 1 Gbps connectivity to customers. In 2012, Kansas City was selected as the first Google Fiber city.
But, was it an experiment? An attempt to goad ISPs to upgrade their networks? The start of a new Google business? In 2013, Milo Medin, who was heading the Google Fiber project, said that they intended to make money from Google Fiber and that it was a "great business to be in."
Today, Google Fiber is operating in three cities and they are committed to installing it in six others. Eleven cities, including Los Angeles and Chicago, have been invited to apply.
|Google is considering big cities Los Angeles and Chicago.|
Los Angeles and Chicago were just added, and it is significant that they are the first very large cities -- in both population and area -- on the list.
Since the initial installation in Kansas City, Google has codified the city-selection process in an informative checklist document. Google knows they are offering a service that will benefit the city in many ways, so the checklist is essentially the guide to an application form in which the city has to offer access to poles and tunnels, 2,000 square-foot parcels for equipment "huts," fast track permitting, etc.
I expect that Google will also have their eye on the Los Angeles tech startup community and entertainment industries. While Google Fiber does not seem to be a mere "experiment," they will doubtless enable and discover new applications that capitalize upon gigabit connectivity (and increase Google ad revenue).
Rollout order within a selected city is governed by the willingness of residents of a neighborhood to sign up for the service. High demand areas get high priority. But, this can exacerbate the digital divide within the city -- serving wealthy areas before poor areas. Google encountered this problem in Kansas City. As shown below, wealthy neighborhoods (green) committed before the poorer areas, so Google initiated programs to reach out to them.
|Wealthy KC neighborhoods committed early.|
Based on that experience, they now consider inclusion plans in the application process and hire city-impact managers for fiber cities. They also offer very low-cost copper connections for those who cannot afford fiber.
I am not familiar with the situation in Chicago, but Los Angeles has been pursuing fiber connectivity for some time. The city issued a request for proposals for city-wide fiber two years ago, and last year CityLinkLA was formed with the goal of providing "basic access to all for free or at a very low cost and gigabit (1 Gbps) or higher speed access at competitive rates." The effort has been led by Los Angeles Mayor Eric Garcetti and Councilman Bob Blumenfield and they are working with both Google and AT&T toward that goal.
I assume that AT&T will upgrade the copper running from their fiber nodes to individual premises to achieve faster speeds (AT&T's plant is telephone wire, so that would mean a DSL-family technology like G.fast rather than cable's DOCSIS 3.1), but they only serve a portion of Los Angeles. Other areas may have to wait for Google. It seems that Verizon gave up on their fiber offering, FiOS, some time ago.
Now for the belated full-disclosure. I live in Los Angeles, and am hoping that competition between Google or AT&T or someone will one day free me from the evil clutches of my current monopoly broadband service provider, the dreaded Time Warner Cable.
Friday, December 04, 2015
The recent LA Startup Week reminded me of the early days of personal computing and of the meetup that may have initiated the LA startup community.
When personal computers were getting off the ground in the 1970s, hobbyists and budding entrepreneurs met at places like the Homebrew Computer Club in Palo Alto or the Southern California Computer Society in Los Angeles to talk about what they were doing and to help each other out. There was idealistic, shared enthusiasm – we knew PCs were totally cool and were a "big deal."
The line between competition and cooperation among entrepreneurs was fuzzy. So was the line between entrepreneurs and hobbyists. Competitors met for beers after meetings to talk about the boards they were designing, and I recall Steve Wozniak offering copies of the schematic and parts list for his wire-wrapped computer at the Homebrew club while Steve Jobs was behind a card table offering to sell them assembled.
The early Internet days felt the same – the Net was clearly revolutionary and it was born in an academic community with a culture that valued sharing and collaboration. The National Science Foundation backbone network (NSFNET) was established "to support open research and education in and among US research and instructional institutions" – for-profit traffic was not acceptable. (NSF also established an International Connections Program, which subsidized NSFNET links from academic and research networks in other nations.)
The LA startup scene
In April, 2012, I attended the first meet-up of Tech Entrepreneurs of Venice, CA. It felt like the early PC days. There were no business suits or fancy offices. As you see below, we met in a warehouse (thanks to MovieClips), and stood around drinking beer and talking. But, the attendees were not hobbyists and the focus was on business opportunities, not research and education.
Venice is a small part of Los Angeles and might have seemed an unlikely place for the birth of the next "Silicon X," but it has a history of avant-garde community and culture.
Venice was the heart of the LA beatnik community in the 1950s and the counterculture of the 1960s. The Peace and Freedom party was founded in Venice and it was the center of the LA art scene. The design studio of Charles and Ray Eames was also in Venice. The Eameses were design rock stars whom IBM turned to for their World's Fair pavilion and many IT and education exhibits. They even designed the look of IBM mainframes -- the Eameses put the blue in "Big Blue."
I don't know if that meetup marked the founding of "Silicon Beach" (a name also claimed by Miami), but it felt like it.
|Homebrew Computer Club meeting in Palo Alto|
|SCCS Interface, published by the Southern California Computer Society|
|First Venice tech entrepreneurs meetup|
|This building housed the Eames studio|
|Inside the Eames studio|
Posted by Larry Press at 12:28 PM (permanent link)
I recently attended several events during LA Startup Week and, even though I live in Los Angeles, I was surprised at the scope of the LA tech community. RepresentLA maintains a multi-layer, interactive map showing that greater Los Angeles is (currently) home to 1,099 self-defined Internet startups along with accelerators, incubators, co-working spaces, investors, consultants and hacker spaces that support them.
|Over 1,000 startups and organizations that support them|
The startup week featured dozens of sessions with how-to talks, demos, pitches by organizations that support startups, etc. Kevin Winston of Digital LA kicked the week off with an overview presentation on the startups, the organizations that support them with capital and services and companies like his that support the community. He said only New York City and Silicon Valley have more startup activity than LA and he made the point by listing examples of successful companies, big buyout deals and the numbers of startups and supporting organizations.
|Kevin Winston presenting an overview of the startup ecosystem.|
Winston’s company, Digital LA, organizes events and tracks the Los Angeles startup scene – kind of a digital chamber of commerce.
Represent LA lists 30 accelerators, 44 incubators and 52 co-working spaces. The difference between accelerators and incubators can blur, but both are intended to help startups get off the ground. Accelerators provide space, mentoring and capital to startups in return for equity in the company. They typically admit a cohort of companies for a limited time – perhaps ten companies for four months -- then admit a new cohort.
Incubators are similar to accelerators, but companies may come in at an earlier stage – accelerators often want a company that has a product and is generating revenue – and they are more flexible on the time the company stays in the incubator.
Note that some of these are focused on a specific industry – for example the LA Dodgers' accelerator focuses on sports related applications.
The services provided by co-working spaces also overlap with those provided by accelerators and incubators. In a co-working space, you pay rent for the facilities you need and hopefully meet people with common interests and complementary skills.
Some of the startup week events were held at the Cross Campus co-working space, shown below. Cross Campus will rent startups everything from a virtual office – just a mailing address and the right to schedule meetings or demos – to a suite of ten or more offices. A full-time dedicated desk rents for $500 per month.
|The Cross Campus co-working space|
The open space and list of amenities convey the startup culture and values – here are some of the amenities: 7-times filtered Flowater, free craft beer, standing desks, only low-VOC cleaning products, a weekly on-site massage therapist and, of course, fast Internet connectivity. (I'm a misfit – I had to Google "low VOC cleaning products".)
My guess is that the most valuable thing one gets out of participation in a co-working space, accelerator or incubator is meeting people with complementary interests, values and skills.
Los Angeles also has willing investors. The Represent LA site lists 72 angel and venture capital investors and Winston gave examples of a number of high-valuation investments during the last year.
There are also 17 hacker spaces open to hardware startups and 287 consultants – offering technical and business assistance. (Many of the latter also offer their services as accelerator and incubator mentors.)
If you would like to keep up with the happenings in Los Angeles, subscribe to the newsletters and event listings at Digital LA and the organizations that hosted the week's events: Cross Campus, General Assembly and Expert Dojo.
I was surprised by the number of startups and the organizations that support and finance them in Los Angeles. The emphasis was on making money from startups – more MBA than hacker. The business emphasis reminded me of athletes who dream of playing professionally in the big leagues – I wonder how many of the startups will be around in a year.
The week also reminded me of the early PC days in Los Angeles and Silicon Valley and of a meet-up in Venice, CA that may have marked the beginning of the LA tech community, so I wrote a companion post on those times.
Mayor Eric Garcetti launched an initiative aimed at upgrading the Los Angeles Internet infrastructure soon after his election and it may pay off. AT&T has announced vague plans for fiber rollout and, more interesting, Google is considering Los Angeles for their 1 Gbps Google Fiber.
Google Fiber is an important asset, so Google can negotiate for concessions, but the city has some negotiating power too. One of Google's goals is to spur the development of unique applications that require high-speed connectivity, which the Los Angeles tech startup community and entertainment industries may very well invent. For more on Google Fiber and their consideration of Los Angeles, see this post.
Posted by Larry Press at 12:27 PM (permanent link)
Thursday, November 12, 2015
Can .co do for Colombia what .tv has done for Tuvalu?
But it wasn't really free. The name is free for the first year, but being able to manage the DNS costs $2 per month, so the real cost is $24 per year and the marketing gimmick seemed cheesy, so I checked with Hover.com, a registrar I've dealt with in the past. They charge $24.60 for the first .co year and contracts for 2-5 years are $51.20, $77.80, $104.00 and $131.00. (I wonder what the algorithm is for calculating those small, uneven increases).
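Out of curiosity, the per-year increments are easy to compute (a quick sketch; reading the multi-year prices as cumulative totals is my interpretation):

```python
# Hover.com's quoted .co prices: first year, then 2-5 year contract totals.
prices = [24.60, 51.20, 77.80, 104.00, 131.00]

# The cost of each additional year is the difference between successive totals.
increments = [round(b - a, 2) for a, b in zip(prices, prices[1:])]
print(increments)  # [26.6, 26.6, 26.2, 27.0] -- small, uneven steps
```

So each additional year costs roughly $26-27, with no obvious pattern to the variation.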
So, if you are willing to settle for .co rather than .com, you can save quite a bit.
Can .co do for Colombia what .tv has done for Tuvalu?
The island nation Tuvalu has profited from their country code domain name, .tv. In 1999, the Tuvalu government licensed .tv for $1 million per quarter with a $50 million cap within 12.5 years. They also retained 20% equity in the licensing company. Subsequently, Verisign took over and, in 2012, renewed its contract to manage the .tv registry until December 31, 2021. The terms of the current agreement with Verisign were not announced.
There is money to be made on .co as well. In March 2014, Neustar paid $109 million for .CO Internet and they are trying to go beyond name registration to form a community of .co domain owners -- featuring promotional videos of .co company founders on their web site and offering coupons for discounts on products, services, conferences, etc. They also promote the .co brand -- running ads and lining up large companies for one-letter domain names. For example, Twitter uses t.co for URL shortening, so millions of people see it every day.
They also pay the Colombian government a fee. Colombia has had a difficult time the last 25-30 years with revolutionary guerrilla armies, government death squads, drug cartels, murder and kidnapping, but the Economist says Colombia is close to a historic peace agreement that will transform its prospects. Perhaps popularizing the .co domain name is a small part of that transformation.
Here is a short video promoting the notion of a community of .co companies:
Posted by Larry Press at 8:12 AM (permanent link)
Wednesday, October 07, 2015
How about a Hole in Space between Havana and Miami? Between Jerusalem and Gaza City?
Cuba, one of the least connected nations in the world, has recently created 35 public-access, WiFi hotspots around the island. While 35 hotspots is a drop in the bucket, this opening is a start and it has been noted in many articles and blog posts.
|Miami Herald photo|
Cubans are living out some of their most personal moments — family reunions and introductions to new babies and spouses — not in the intimacy of their own homes but in public plazas and parks.

Raúl can relax -- the people are using the Internet to communicate with loved ones, not to organize political rallies.
This reminds me of an often overlooked, pioneering project. In 1980, artists Kit Galloway and Sherrie Rabinowitz created a "Hole in Space" by connecting larger-than-life displays in New York and Los Angeles with a satellite feed. It was the mother of all video chats, demonstrating that electronic communication could convey presence and emotion.
Check out the following five-minute excerpt from a video documenting the event:
If you liked that, watch the full half-hour video:
Hole in Space was created more than a decade before we saw the first, simple version of the World Wide Web and Rabinowitz and Galloway were artists, not computer scientists. Products begin with a vision. In this case, Rabinowitz and Galloway had the vision and built the engineering prototype demonstrating its value. As the saying goes "demo or die."
Hole in Space was only one of their projects. For an overview of a quarter century of Rabinowitz and Galloway's work, see the Electronic Cafe International archival Web site.
Today, video conferencing is ubiquitous -- it has even reached Cuba -- but our video chats are on small screens. I'd love to see "Hole in Space, 2015," using today's technology. Large, public advertising displays are common and they can be linked over the Internet. Wouldn't it be cool to punch a lot of holes in space?
Where would you put the displays? For a start, how about one between Gaza City and Jerusalem or between Havana and Miami? Kickstarter anyone?
Thursday, October 01, 2015
I have been experimenting with supplementary presentations in my undergraduate Internet literacy course this term. The presentations are generally triggered by a current event that is relevant to our class. They consist of a handful of annotated slides, most of which have an image and just a few words and questions, so I expect the students to study the accompanying notes after I present them in class.
I've presented these so far this semester:
- Evolution of a product from vision to ubiquity, using WiFi as an example
- Study of MOOC benefits for career and education
- Scientific collaboration and taxonomy versus search
- UCLA survey of first-time freshmen
- Ad blocking
- Examples of Internet democracy applications
Here are a couple random slides to illustrate the presentation style:
Monday, September 14, 2015
The UCLA Higher Education Research Institute conducts annual surveys of incoming college freshmen. There are many interesting questions, but since I teach an undergraduate course on Internet literacy, and education applications and services are increasingly important, I focus on two questions in particular:
- Have you used an online instructional website (e.g., Khan Academy, Coursera) as assigned for a class?
- Have you used an online instructional website (e.g., Khan Academy, Coursera) to learn something on your own?
Three things strike me in looking over these results:
- Students are not waiting for their schools -- they are taking online classes on their own.
- Students attending historically black colleges are doing more online study than others.
- Online classes are growing in popularity among students working on their own and as assigned work.
(I also wrote a post after last year's freshman survey).
|My grandson took a break from Minecraft to complete Khan Academy algebra 2 before starting high school.|
Wednesday, September 02, 2015
I don't know if that is the case, but I frequently post news before it turns up in Google News or as a Google Alert. (The two systems are separate).
Here's an example:
On August 27th, at 2 AM PDT, Bloomberg News published an article entitled "Cuba's Internet Dilemma: How to Emerge From the Web's Stone Age." by Indira Lakshmanan.
The article included a photo, a graph and two ads for IBM. In spite of being short, it had six bold-face section headings. I did not post anything about the article because it offered no news and had a significant misconception.
Later that day, I received a Google Alert for an article with the same title on the SunHerald site. The SunHerald version had a different photo and had cut the graph and sub-heads, but the body text was identical to the Bloomberg version. The byline credited "INDIRA A.R. LAKSHMANAN of Bloomberg News." Notice that she now had two middle initials and her name was all caps.
There was another credit at the end of the article: "Brian Womack contributed from San Francisco."
Brian did not change the body text, but he dropped the section headings and graph, changed the photo and gave Indira two middle initials and capitalized her name. Did his "contribution" take more than ten minutes?
The SunHerald version was preceded by a full-screen ad that I had to click to remove and, when the article was displayed, it had ads for liposuction, a bail bond service, a microwave oven (I had purchased one from Amazon earlier in the week), a smiling, blond investment adviser and two ads for the SunHerald publisher.
Searching on the first sentence of the article: "Julio Hernandez is a telecommunications engineer, but like almost anyone else in Cuba who wants to get on the Internet, to do so he must crouch on a dusty street corner with his laptop, inhaling car exhaust and enduring sweltering heat", turned up many more hits, but a lot of those were snippets with links to a full-text version.
I searched Google Alerts on the key phrase "Cuba Internet" and turned up links to six full-text copies of the story: 1, 2, 3, 4, 5 and 6
I sent email queries asking about the criteria for inclusion in Google News to email@example.com and Stacie Chan, Media Outreach Manager, Google News, but neither replied. I also emailed the Bloomberg press office asking if they had licensed the article to the SunHerald and others, but received no reply.
This experience leads me to think that:
- Google's News and Alerts algorithms find stories on a topic, not necessarily news or novel analysis.
- Google's News and Alerts algorithms fail to detect redundancy, which may be intentional because it increases their ad revenue. (Might they discriminate in favor of sites with ads?)
- Google's algorithms seem to pay attention to sub-headings and images, but not body text, in screening for redundancy.
- Snippet posts often link to derivative copies rather than the original post (by Bloomberg in this case).
- Relatively small, focused, long-tail blogs and news sites are not likely to be seen by Google News or Alerts.
I hope not. Perhaps Google (or Facebook) will be clever enough to automate discovery of worthwhile long-tail news sites or use human curators to find them.
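Detecting the redundancy itself seems within reach. A naive sketch (my illustration, not Google's method): fingerprint the normalized body text, ignoring the headings, images and bylines that differed between the Bloomberg and SunHerald copies:

```python
import hashlib
import re

def body_fingerprint(text):
    """Hash the body text after normalizing away cosmetic differences:
    lowercase, collapse whitespace, drop punctuation. Headings, images
    and bylines differ between copies, but the body text does not."""
    normalized = re.sub(r"\s+", " ", text.lower())
    normalized = re.sub(r"[^a-z0-9 ]", "", normalized).strip()
    return hashlib.sha256(normalized.encode()).hexdigest()

# Hypothetical fragments: identical body text, different case and line breaks.
bloomberg = "Julio Hernandez is a telecommunications engineer, but like almost anyone else in Cuba..."
sunherald = "JULIO HERNANDEZ is a telecommunications engineer,\nbut like almost anyone else in Cuba..."

print(body_fingerprint(bloomberg) == body_fingerprint(sunherald))  # True
```

A real system would need fuzzier matching for lightly edited copies, but even this exact-match-after-normalization check would have caught the SunHerald reprint.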
Disclosure: I have given permission for copies of blog posts to be posted by others (at no cost).
As mentioned above, I sent email queries to Google when I got the idea for this post, but did not hear from them before I published it. This morning I got an email from Stacie Chan. Here is what she said:
There aren't any minimum number of clicks needed to get accepted as a News site into Google News. We do, however, have strict quality and technical guidelines that sites must follow to get accepted and maintain their status in Google News. We accept smaller blogs/sites as well as larger ones.

Google Alerts, as you mentioned, is a separate product from Google News. But many of their alerts are triggered by a new article from publishers in our database.

Hope that helps!

(I corrected what appeared to be cell-phone typos.)
I follow the keyword "ETECSA" on Google Alerts and Google recently alerted me to a post on Cuban plans for home Internet connectivity at Frogoff.com. (ETECSA is Cuba's state-run Internet service provider).
The post was an identical copy -- text, title and images -- to my earlier post.
I don't really care that the slimeballs running Frogoff.com copied my post, but I do care that Google Alerts linked to it, not mine. Google will not tell me how they decide which sites to include in News and Alerts, but, whatever it is, it rewards parasites like Frogoff.com and overlooks relatively small, specialized long-tail sites that cover a given topic. (In this case, the state of the Cuban Internet).
I received the following Google Alert yesterday:
ETECSA is Cuba's government monopoly telecommunication company. The message alerts me to the fact that ETECSA upgraded their cell phone network in 1999 -- not exactly "news."
The alert links to ETECSA's Wikipedia page. Evidently, Google sends an alert to any change in a page that contains the alert keywords. The alert was not caused by the cell network upgrade, but that sentence contained the term "ETECSA." In fact, the most recent change to the ETECSA page in Wikipedia was made on August 12, so the non-news alert is two months old.
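I can't know Google's actual trigger logic, but the behavior is consistent with a simple rule: alert whenever a page containing the keyword changes, regardless of what changed. A minimal sketch of that inferred rule:

```python
def should_alert(keyword, old_page, new_page):
    """Naive alert rule: the page changed AND the keyword appears
    somewhere in it -- even if the changed text is unrelated."""
    return new_page != old_page and keyword.lower() in new_page.lower()

old = "ETECSA upgraded its cell network in 1999. Offices: Havana."
new = "ETECSA upgraded its cell network in 1999. Offices: Havana, Santiago."

# The edit touched only the office list, but the alert still fires
# because "ETECSA" appears elsewhere on the page.
print(should_alert("ETECSA", old, new))  # True
```

A smarter rule would check whether the keyword appears in the changed text itself, not just anywhere on the page.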
Here is another type of Google Alert failure:
This appears to be a link to a post on the BN Americas news site about a service outage at Cuban email provider ETECSA, but it is not. Instead, it is a link to a post on a spam site called WN.com.
As you see below, the WN page has one sentence on the ETECSA outage, hidden in a sea of spam links and illustrated by a couple dancing on the beach. Can you find the link to the BN Americas article?
Can't Google come up with an algorithm or blacklist to filter out this sort of "news"?
I have a longstanding interest in satellite Internet connectivity for developing nations, so have been watching SpaceX's attempts at recovering booster rockets. They tried for the third time to recover a rocket on a drone barge at sea on January 17, but failed. I watched the launch webcast and posted a note on the failure the following morning.
Since I am interested in the topic, I have subscribed to "Musk-Satellite" Google Alerts, and at 10:08 Sunday morning, I received one on a Washington Post article entitled "Elon Musk's SpaceX to attempt another rocket landing. This time with a twist."
Judging by the title, the article had been written before it was known that the attempt had failed, but it now says the landing failed and includes several tweets time stamped after 10:08 AM. Evidently, the story was revised after it was first posted.
But, that was just the first Google Alert -- I have received 158 more since then, many of which link to two or three articles on the failed recovery. I've looked at several of these articles -- they provide no additional analysis.
They are slowing down now -- I have only received two so far today. By the time these Alerts stop, I will have received links to hundreds of redundant articles, and thousands of ad impressions.
Saturday, August 22, 2015
During the second quarter of 2015, Google reported selling $16.023 billion worth of advertising -- 11% more than the second quarter of 2014. Advertising is their bread and butter, but "other sales" grew by 17% to $1.704 billion.
I don't know how "other sales" breaks down, but a chunk of that is hardware devices like the Pixel Chromebook, Chromecast, Nest thermostat, Nexus phone and, now, WiFi routers.
Google's first WiFi router, the OnHub, is built by Chinese manufacturer TP-Link, and ASUS will build a second model later this year.
With the exception of the Chromecast, Google seems to be going for high end devices, but does the world need another $200 home router? Why would Google bother? I can think of a couple of strategic reasons.
For one, Google has an eye on the home automation market -- controlling things like their Nest thermostat and your TV set. The OnHub has the potential to become a network-connected home automation hub -- like the Amazon Echo -- which sits in your living room and listens for commands like "turn on the bedroom lights" and "lock the front door."
The OnHub lacks the Echo's microphone, but maybe the ASUS device or OnHub v2 will take care of that.
The second strategic advantage is that, since the device is online, Google will be able to tune it dynamically for maximum performance. Google will be able to monitor your network and change OnHub parameters and firmware. You will also be able to control the network more effectively, for example, giving streaming video in the den priority over email in your home office.
Being able to improve your WiFi performance is a nice feature and, the more time you spend online and the less time you spend waiting for content, the more ads Google will be able to show you.
There are many comments on this post on the Slashdot Web site. Some worry about Google's ulterior motives:
They want to be able to mine your data at the lowest possible level, have a handy backdoor available in case the NSA comes calling, and so they can insert their own ads on every page of every website you ever browse.

There is no way in hell you should be trusting a Google which has remote access to your network, home automation, doors and every other thing Google thinks they're going to sell you.

They want to control your network. They want to inject advertising into everything you do. They want you to have no choice but to use DNS servers they control.

These worries are contradicted by this comment:
Trying to inject advertising into your internet stream would be a ham-handed approach the idiots at Lenovo would try. Google is more clever than to slit their own device's throat with something so stupid as that. (Note that Lenovo was caught injecting ads and I suggested that selling more ads would be a nice side-effect of improving speed, but agree that Google would not inject ads on their own).
I know a couple of people who were involved in the development of OnHub and, FWIW, they say that the motivation was that there's a need for a WiFi router that performs better and is more secure. Not a strategic bet, just a perceived market opportunity which they thought Google was well-equipped to fill.

With regard to performance, the antenna design of the OnHub is supposed to be dramatically better than anything else on the market, and the device incorporates ideas from the Software Defined Networking stacks Google developed internally for its data centers, to optimize data flow. I wouldn't have thought there was much you could do to make WiFi work better, since the ISP connection is generally the bottleneck, but apparently there is. With respect to security, it adopts a number of ideas from ChromeOS, plus fully-automated updates. Probably the biggest security benefit compared to the competition is that security is actually a primary design goal, which isn't the impression I get from makers of home routers.

We'll see if OnHub actually is enough better than the competition to justify its premium price. Based on what I know of the people working on it I expect that it will. I ordered one.

Check Slashdot out -- there are many more comments.
The FCC is considering new rules that would ban WiFi firmware modification. I guess the FCC is worried about hacking or perhaps increasing power beyond legal limits, but, if passed, the regulations would limit the ability of companies like Google and Amazon to upgrade their Internet-connected home hubs.
The FCC is open for comments on this proposal through early morning, September 8.
Tuesday, June 23, 2015
OneWeb and SpaceX are using different technologies and have different organizational strategies.
Two companies, OneWeb and SpaceX, are in competition to offer global Internet access and backhaul over constellations of low-earth orbit satellites. SpaceX recently announced that they were ready to begin limited testing and OneWeb CEO Greg Wyler gave a progress report last March.
At the same time, OneWeb announced that five companies had bid on the contract to build their satellites. Airbus has won the contract. They will build 900 satellites weighing 150 kilograms each, 648 of which will be used in OneWeb's initial, near-polar orbit constellation, as shown in this animated video:
The Airbus announcement raises many questions, which are addressed in a terrific in-depth interview of Brian Holz, OneWeb's Director of Space Systems. Holz talks about the reasons for producing the satellites in the US and the factors in choosing a factory location, the cost of the satellites ($400,000-500,000 each), the need to have global participation in a global project, launch services, satellite reliability and plans for eventually deorbiting them, financing and the business case, the search for manufacturers of millions of user terminals and antennas, etc.
|Brian Holz, OneWeb’s Director of Space Systems|
Holz and CEO Greg Wyler have experience -- they were together at O3b Networks, which is already delivering industrial-scale connectivity using medium-orbit satellites -- and are worthy competitors for Elon Musk's SpaceX effort. While seeking the same goal -- global connectivity -- OneWeb and SpaceX are using different technologies and have different organizational strategies.
SpaceX plans to have around 4,000 smaller, cheaper, shorter-lived satellites orbiting at only 645 kilometers. SpaceX will also keep more of the project in-house than OneWeb. OneWeb is becoming a coordinated coalition of partners. Virgin Galactic, Qualcomm, Honeywell Aerospace and Rockwell Collins are already on board and Airbus will not be a mere supplier, but a partner in a joint venture, which will include others for finance and marketing as well as technology.
Both projects are extremely ambitious, expensive and risky. I don't know which, if either, will "win," but the best possible outcome would be for both to succeed and compete with each other and with terrestrial ISPs.
I worry about the problems of capitalism with its massive concentration of power and income inequality, but this is an example of capitalism at its best.
OneWeb founder Greg Wyler announced that they have received $500 million in funding from a group that includes Airbus and is seeking to raise that much or more in their next round of funding. Previously announced partners Qualcomm (communication technology) and Virgin Galactic (launch services) are also investors.
Their plan will require an estimated $2-2.5 billion, and it does not seem they will have trouble raising it. I'd like a few shares myself :-).
|OneWeb founder Greg Wyler|
Wednesday, June 03, 2015
We have been following the plans of Elon Musk and Greg Wyler to launch constellations of low-earth orbit satellites to provide global Internet service and fast long-distance links. Neither company plans to be in operation for several years, but Musk's SpaceX is ready to test two satellites.
SpaceX has filed an application to launch two test satellites and satellite experts have been discussing it on Reddit.
The application calls for launching two identical Ku-band downlink satellites (cubesats?), which will orbit at 625 kilometers and have an expected lifetime of 6-12 months. The objective of the launch is:
To validate the design of a broadband antenna communications platform (primary payload) that will lead to the final LEO constellation design using three broadband array test ground stations positioned along the western coast of the US.

They will do broadband array testing using a network of three test ground locations: SpaceX headquarters in Hawthorne, California; Tesla Motors headquarters in Fremont, California; and SpaceX Washington in Redmond, Washington. (It pays to own multiple companies). Two types of ground terminal will be evaluated at each location.
These test results will lead to a revised, perhaps final version of the satellites and ground stations. I've no idea how soon they might be ready for operation, but SpaceX is first out of the gate in the satellite constellation Internet service race.
Elon Musk wants to go slowly on the SpaceX satellite Internet project:
Many companies have tried and failed -- we want to be completely sure of success and not overestimate our strength.

OneWeb is pushing ahead, with Arianespace planning at least 21 launches between 2017 and 2019.
OneWeb seems to be working with many partners -- OneWeb CEO Greg Wyler is shown below with Stéphane Israël, Chairman and CEO of Arianespace (center) and Richard Branson, founder of Virgin Galactic (right) -- while SpaceX is working on their own satellites and launch vehicles.
Saturday, May 16, 2015
|Musk's Tesla Energy talk was off the grid.|
Musk was thoughtful at an early age. While he was in college, he concluded that the three areas that would most affect the future of humanity were the Internet, sustainable energy and space exploration. He did not expect to found companies in all three areas, but went to grad school to work on energy storage for electric cars (using capacitors, not batteries). He realized that his capacitor storage might fail and decided he would rather work on the Internet than study it.
He speaks of large problems he wants to solve, not of business, profit, return on investment or stock prices -- those are means to his ends. He wants to bring Internet connectivity to sparsely populated and developing areas and provide 50% of global, long-distance connectivity (5-15 years), accelerate the advent of sustainable transport (half of our cars to be electric in 13-14 years), send people to Mars (12 years) and eliminate use of fossil fuels (a generation or two).
He is not trying to do any of this on his own -- he wants to be a catalyst. Tesla was created to accelerate the advent of sustainable transport, not to dominate the car industry, so they will not initiate patent lawsuits against anyone who, in good faith, wants to use their technology. Similarly, Musk sees their huge battery factory, Gigafactory version 1, as a product to be replicated by others -- The Tesla policy of open sourcing patents will continue for the Gigafactory and battery systems.
He is a relaxed speaker with a sense of humor. Musk and Steve Jobs are two of the best speakers I have seen, but a Jobs presentation was planned and rehearsed to perfection, while Musk seems to speak off the cuff. That being said, his product introductions in the following Tesla Energy talk were reminiscent of Jobs -- he used a few slides with images, not text, and spoke over them in the Jobs style. He even included a Jobs-like surprise, revealing that the auditorium and talk were powered by his batteries, which had been charged using solar power (image below) -- like Jobs's "one more thing" moments or the "three new products" that turned out to be the iPhone.
If you would like to get to know Elon Musk and how I came to admire him, I recommend the following videos. (I have my students watch them).
Sal Khan of the Khan Academy interviewing Elon Musk
April 2013, 48:41, 502,483 views
This conversation gives insight into Musk’s goals and his motivation for investing in Tesla and SpaceX and you get know Sal Khan as well.
Recruiting engineers for the SpaceX satellite Internet access project
January 2015, 25:53, 23,348 views
In this talk Musk describes his plan to create a constellation of satellites to provide fast, global Internet service.
Announcement of Tesla Energy
May 2015, 18:02, 2,112,997 views
In this presentation, Musk announces products -- integrated, open-source battery systems for consumers, enterprises and utilities and the open source Gigafactory to manufacture them.
If you'd like to see more, there is a YouTube channel that claims to have links to every Elon Musk video and you can read an excerpt from a forthcoming biography of Musk here. It's a long post that traces events from his early desire to grow plants on Mars in order to stimulate interest in space through the near-bankruptcy of Tesla and/or SpaceX, which was averted at the last minute by winning a NASA contract. You can see a video (4:53) of an interview of the author, Bloomberg's Ashlee Vance, here.
This post looks at "the bad behavior of visionary leaders" and concludes that leaders like Musk, Jobs and Bezos could be more effective it they behaved better:
The question raised by the stories of these three men is not whether being tough, harsh and relentlessly demanding gets people to work better. Of course it doesn't, and certainly not sustainably. Can anyone truly doubt that people are productive in workplaces that help them to be healthier and happier? The more apt question is how much more these men could have enhanced thousands of people's lives -- and perhaps made them even more successful -- if they had invested as much in taking care of them as they did in conceiving great products.

I've personally been chewed out by the young Bill Gates and wonder whether he should not be added to this list.
Tuesday, May 12, 2015
Monday, April 20, 2015
LinkedIn acquires Lynda.com and hopes to "create economic opportunity for every member of the global workforce"
Early MOOCs and Internet-based classes focused on traditional university courses, but there has been a shift of emphasis toward vocational training and lifelong learning.
Lynda.com has focused on vocational training and lifelong learning since its beginning in 1995. They have developed over 2,900 video courses in English, German, French, Spanish, and Japanese and have 4 million subscribers in 150 countries.
LinkedIn has 350 million users who are looking for career advancement and job opportunities. (It's currently the 14th most visited Web site on the Internet).
LinkedIn has acquired Lynda.com and one can imagine the combined company pointing users to specific courses that would help them move up in their current positions or find better jobs.
That seems to be the basic idea and they say they want to do it on an ambitious global scale. Ryan Roslansky, Head of Content at LinkedIn, says the vision of the merged company is to "create economic opportunity for every member of the global workforce" and their goal is to "lift and transform the global economy."
(That sounds like something you might hear in a VC pitch in the comedy TV series "Silicon Valley," but let's suspend judgement).
They hope to create an "economic graph" -- compiling databases with profiles of every member of the global workforce (what they have studied and what their skills are) and of the available jobs and the skills required to obtain those jobs at every company in the world. Those databases, plus an inventory of courses offered by every higher education organization and university, will let people find the training they need to get a specific job and let employers find the people who are qualified to do a particular job.
My first reaction is that establishing standards and definitions that would enable them to come close to that vision across a variety of industries, cultures and languages is impossible, but they might be able to create economic graphs for specific industries and countries.
I worked as a consultant to Hyundai some time ago, and the Human Resources department had a system for tracking employee skills, job skill requirements and available training classes. There are also human resources software packages like Trackstar for such systems. Perhaps LinkedIn will take a bottom-up approach, replicating this sort of system industry by industry.
A couple of years ago, I wrote a post asking if there was a place for Lynda.com and other online training companies in the MOOC discussion. It's become clear that the answer is "yes" and with this acquisition, LinkedIn will be a prominent player and competitor to companies like Udacity and Coursera. To the extent that company hiring practices and societal certification change, they will also be an alternative to universities for students whose primary goal is getting a good job.
Here is a video of LinkedIn CEO Jeff Weiner describing the Economic Graph and their vision for the next ten years:
Saturday, April 11, 2015
Microsoft was founded in April 1975, when the personal computing hobby was just beginning, and the Economist has an article on the company evolution to "middle age."
Microsoft began with development tools -- a BASIC interpreter and Pascal and Fortran compilers -- but soon moved on to Windows and later Office. Under Bill Gates, and later Steve Ballmer, the company strategy was to "strengthen Windows, to make it ever more crushingly dominant." That strategy worked well during the desktop/laptop/on-premises server era, but it constrained Microsoft -- keeping them from pursuing new opportunities on the Internet and mobile devices.
Current CEO Satya Nadella, shown below with Gates and Ballmer, has a different strategy -- "just build stuff that people like."
That has led to the porting of Office to other operating systems and the Internet, support of open source and emphasis on their Internet platform, Azure.
The article contrasts Microsoft's middle age slump with Apple (founded in April 1976), which has passed them in profit:
and now accounts for a much larger percent of the US technology sector than Microsoft:
The Economist article is on Microsoft, but the fall from dominance of IBM, as illustrated in the above graph, is even more striking. IBM totally dominated the (smaller) technology market until a disruptive startup, Microsoft, led the revolution that toppled them.
Friday, April 10, 2015
In 1965, Intel co-founder Gordon Moore wrote an article called "Cramming more components onto integrated circuits."
In the article he said
The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000.

Note that he is not talking about what would be the largest theoretically possible chips, but about what would be cost effective.
Moore's prediction was based upon extrapolation of the history of the integrated circuits up to that time:
Note that he is predicting exponential growth -- growth at a constant percentage rate.
He does not use the term "Moore's Law" in the article, but the term/meme caught on and we are still using it to describe exponential growth of all things techie -- storage and memory density and speed, communication speed, etc.
Moore's projection held up well beyond 10 years. In this plot of the number of transistors on commercial CPU chips through 2011, the line represents doubling every two years:
The accuracy of his projection is all the more remarkable when you realize that the prediction was made six years before Intel's first CPU chip, the 4004, which had 2,300 transistors.
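As a rough sanity check on that line, here is a small sketch (my own arithmetic, not from the article) that extrapolates from the 4004's 2,300 transistors in 1971, doubling every two years:

```python
# Moore's-law extrapolation sketch: start from Intel's 4004
# (1971, 2,300 transistors) and double every two years.
def projected_transistors(year, base_year=1971, base_count=2300):
    """Projected transistor count, assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{projected_transistors(year):,.0f}")
```

By 2011 the extrapolation reaches roughly 2.4 billion transistors, in the same ballpark as the largest commercial CPU chips of that year -- which is why the points in the plot hug the doubling line so closely.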
At some point, density increases will level off, but that point has not yet been reached. Apple's A8X system on a chip, inside the iPad Air 2, has 3 billion transistors.
The first electromechanical computers used electromagnetic relays as switching elements. Following generations moved to vacuum tubes, transistors and today's integrated circuits. When Moore's law finally hits the wall, will we move to another switching technology and continue improving?
You can check out some cool Moore's Law infographics here.
A Re/code article says Moore's law is 50, but may not reach 60. The article quotes Intel executive Tracy Smith as saying they expect to be making chips with 5 nanometer features (about twice the size of a strand of DNA) around 2022, but that will be the end of the line.
The article goes on to speculate on what technology might come next -- the red question mark in the above figure -- but makes no predictions. It also includes the following video (1m 53s) of Gordon Moore reflecting back on his 1965 article and the term "Moore's law," coined by semiconductor pioneer Carver Mead.
Wednesday, March 18, 2015
Think about the possibility of a WiFi network with a low-latency, 50 mbps back-haul link to the Internet in every school or rural clinic in the world.
I've been tracking Greg Wyler and Elon Musk's satellite Internet projects for some time. Both have been relatively quiet (most of what I know of Musk's SpaceX project came from an unauthorized cell phone video of a recruiting talk he gave), but Wyler talked about his company OneWeb in a keynote at the Satellite 2015 Conference yesterday.
Wyler plans a constellation of about 650 satellites in low-earth orbit (about 1,200 kilometers). He said that they plan to launch satellites in 2017 and hope to begin offering service in 2019. (It seems that OneWeb is ahead of the SpaceX schedule).
They will offer 50 mbps, 30 ms latency connectivity to $250 ground stations that will also serve as hot-spots, providing WiFi, LTE, 3G or 2G connectivity.
As shown below, a terrestrial route between Los Angeles and the tip of Chile requires 14 hops. The same route via satellite may require only five low-latency hops. (The figure is drawn to scale).
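To see why low-earth orbit matters for latency, here is a back-of-the-envelope comparison (my own figures, assuming a satellite directly overhead; real slant paths and processing add delay):

```python
# Minimum ground-satellite-ground round-trip delay, limited only by
# the speed of light (illustrative; ignores slant angle and processing).
C_KM_PER_S = 299_792  # speed of light in vacuum

def min_round_trip_ms(altitude_km):
    # up + down for the request, up + down for the reply
    return 4 * altitude_km / C_KM_PER_S * 1000

leo = min_round_trip_ms(1_200)    # OneWeb-style low-earth orbit
geo = min_round_trip_ms(35_786)   # geostationary orbit, for comparison
print(f"LEO: {leo:.0f} ms, GEO: {geo:.0f} ms")
```

The physical floor for a 1,200 km orbit works out to about 16 ms, which is consistent with OneWeb's quoted 30 ms once routing and processing overhead are added -- while a geostationary satellite is stuck near half a second before any overhead at all.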
Think about the possibility of a WiFi network with a low-latency, 50 mbps back-haul link to the Internet in every school or rural clinic in the world.
Wyler showed a prototype of one of his ground-stations and also showed how easy it is to set up. The operator just spreads the solar panels and turns it on -- five seconds install time. Here we see one on the corrugated roof of a building:
This ease of deployment would be terrific for establishing ad hoc communication in the wake of disasters that had disrupted terrestrial communication.
While I have focused on OneWeb's primary goal of providing Internet connectivity in developing nations and rural areas, Wyler also spoke of providing connectivity in aircraft (and ships at sea).
Of course, all of this is speculation for now. Some conference attendees and presenters were skeptical about Wyler's project, pointing out that his low-cost satellites would have to be replaced every five years or so -- a recurring expense. Critics also pointed out that much of the time, the low-earth orbit satellites will be over oceans, polar regions and other sparsely populated areas.
That being said, Wyler has been able to attract backers and partners, each of which brings money and expertise to the table:
- Virgin Galactic: launch services
- Qualcomm: radio hardware
- Honeywell Aerospace: aircraft equipment and airtime services
- Rockwell Collins: satellite communication terminals for aircraft
One can also imagine OneWeb providing competition for conventional terrestrial ISPs in developed nations. I can dream of going over to Best Buy, picking up a OneWeb ground station, installing it on my roof and escaping the clutches of my ISP monopolist Time Warner Cable. I am not holding my breath till that happens, but I will be keeping my eye on OneWeb's ambitious project.
For some background on Wyler's previous satellite company, O3B Networks, and more on his plans for OneWeb, check out this video:
FierceWirelessTech interview of Greg Wyler.
Wyler says "We've got a pretty clear path. It's not just a technology problem. It is a technology, regulatory, implementation, education problem. It's kind of a little bit of everything." In the interview, he talks about terminal design, their business model and spectrum.
As mentioned above, he stresses ease of installation and low cost for the terminals. OneWeb has the rights to the Ku and Ka spectrum they will use and patent-pending technology to assure non-interference with geostationary satellites in those bands. Scale is critical to their business model -- once the constellation is operating, the marginal cost of adding a new customer is very low.
Friday, March 13, 2015
5G mobile communication is coming and prototypes are being developed along with demonstrations. (Remember the saying that projects had to "demo or die"?)
Here are a couple of 5G prototype demos:
Samsung transmission speed demo -- 7.5 gbps standing still and 1.2 gbps in a car going 112 kph:
Ericsson demonstrates seamless hand-off between LTE and 5G:
and they brought their virtual reality, remote control excavator with them to MWC:
The final products will probably not be as fast as these prototypes, but they will eventually cost about the same as today's mobile radios.
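For a sense of what those demo speeds mean in practice, here is a quick worked example (the file size is my own hypothetical, not from the demos):

```python
# Time to move a file at the demo rates (illustrative arithmetic).
def transfer_seconds(size_gigabytes, rate_gbps):
    return size_gigabytes * 8 / rate_gbps  # gigabytes -> gigabits

movie_gb = 4  # a hypothetical HD movie
print(f"7.5 gbps: {transfer_seconds(movie_gb, 7.5):.1f} s")
print(f"1.2 gbps: {transfer_seconds(movie_gb, 1.2):.1f} s")
```

That is about four seconds for the whole movie standing still, and well under half a minute from the moving car.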
Both Samsung and Ericsson are talking about initial deployment around 2020, but general rollout and ubiquitous adoption will take many years after that. (This is one technology in which developing nations, which are generally more mobile reliant than developed nations, may somewhat narrow the digital divide). Furthermore, there are no 5G standards, and you can bet there will be more than one.
Wouldn't it be nice if there were a global 5G standard -- everyone using the same license free spectrum and protocols -- your phone moving seamlessly between nations and carriers -- cars that were compatible with instrumented roads everywhere ... like WiFi ...?
Awake again -- Maybe I will get a Verizon 5G phone for use in the US around 2022.
By that time, our mobile devices will be 10-20 times as powerful and there will be a lot of "things" connected to the Internet. What new applications will we find for this high-speed, low-latency wireless connectivity?