Monday, November 24, 2014

RAND Corporation's contributions to computer science

What comes to mind when you hear the word Rand? Ayn Rand? Rand Paul? For me, it is the RAND Corporation. Project RAND (research and development) was housed at Douglas Aircraft in Santa Monica, California immediately after World War II, and became an independent, nonprofit organization in 1948. Perhaps the first "think tank," RAND spun off its development work, creating the System Development Corporation (SDC) in 1957.

I don't know how one ranks research institutions, but, for me, RAND ranks right up there with Bell Labs, IBM Research, and newcomers Microsoft Research and Google Research. The following are summaries of some of the computer science advances made by RAND researchers and consultants.

Communication satellites: Science fiction writer Arthur C. Clarke outlined the vision of geostationary communication satellites in a short article published in October 1945. Five months later, Frank Collbohm and James Lipp published a comprehensive engineering study, "Preliminary Design of an Experimental World-Circling Spaceship."

Arthur C. Clarke's vision (left) and RAND's design 

Artificial intelligence: Herbert Simon, Allen Newell and Cliff Shaw did early work on artificial intelligence at RAND and Carnegie Tech. They asked people to talk out loud while proving theorems and noted that their strategy was to apply operations that reduced the differences between the current state of the proof and the theorem they were trying to prove. Their programs, Logic Theorist and General Problem Solver, did the same -- and so does this pigeon:

A pigeon solves Wolfgang Kohler's box-and-banana
problem by applying the Box Move operator.
Operations research: George Dantzig and Richard Bellman invented mathematical techniques for finding optimal or near-optimal solutions to complex but well-defined problems. This work has applications in network design, and you use Dantzig's simplex algorithm whenever you build an Excel spreadsheet to solve a linear programming problem.
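Dantzig's simplex algorithm exploits the fact that a linear program's optimum always lies at a vertex (corner) of the feasible region, and it walks from vertex to vertex, improving the objective at each step. The toy Python sketch below (my own illustrative example, not RAND code) skips the clever walk and simply enumerates every vertex of a two-variable problem:

```python
from itertools import combinations

# Maximize 2x + 3y subject to:
#   x +  y <= 4
#   x + 3y <= 6
#   x >= 0, y >= 0
# Each boundary is a line a*x + b*y = c; vertices are intersections of
# pairs of these lines. The simplex method visits them cleverly -- here
# we just try them all.
lines = [(1, 1, 4), (1, 3, 6), (1, 0, 0), (0, 1, 0)]

def feasible(x, y, eps=1e-9):
    return x >= -eps and y >= -eps and x + y <= 4 + eps and x + 3*y <= 6 + eps

best = None
for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
    det = a1*b2 - a2*b1
    if det == 0:                      # parallel lines: no intersection
        continue
    x = (c1*b2 - c2*b1) / det         # Cramer's rule for the 2x2 system
    y = (a1*c2 - a2*c1) / det
    if feasible(x, y):
        value = 2*x + 3*y
        if best is None or value > best[0]:
            best = (value, x, y)

print(best)   # (9.0, 3.0, 1.0): the optimum is at vertex x=3, y=1
```

For realistic problem sizes the number of vertices explodes combinatorially, which is exactly why the simplex method's guided walk mattered.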

SIMSCRIPT: Harry Markowitz and Bernard Hausner invented the SIMSCRIPT programming language for simulating systems like customers moving through checkout stands at a market. SIMSCRIPT was an early object-oriented language in that it modeled the world as sets of entities and their attributes. Entities could be created and destroyed and their attribute values and set memberships changed when events in simulated time occurred. (SIMSCRIPT is close to my heart because it was the subject of the first class I ever taught. Unfortunately, my wife threw out my SIMSCRIPT t-shirt years ago).

T-shirt -- Entities, Attributes and Sets
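To give a flavor of that world view, here is a rough sketch in Python (not SIMSCRIPT; the checkout example and all the names are my own invention) of entities with attributes, a set whose membership changes, and events occurring in simulated time:

```python
import heapq

# An entity with attributes.
class Customer:
    def __init__(self, name, service_time):
        self.name = name
        self.service_time = service_time

events = []   # future-event list, ordered by simulated time
def schedule(time, seq, action, customer):
    heapq.heappush(events, (time, seq, action, customer))

# Three customer entities arrive at t = 0, 1, 2; each needs 2 time units.
for i, t in enumerate([0, 1, 2]):
    schedule(t, i, "arrive", Customer(f"customer-{i}", 2))

queue, stand_free, log = [], True, []
while events:
    now, seq, action, cust = heapq.heappop(events)
    if action == "arrive":
        queue.append(cust)            # join the checkout-line set
    else:                             # "depart" event frees the stand
        stand_free = True
    if stand_free and queue:
        nxt = queue.pop(0)            # leave the checkout-line set
        stand_free = False
        schedule(now + nxt.service_time, seq, "depart", nxt)
        log.append((now, nxt.name))   # record when each checkout starts

print(log)  # [(0, 'customer-0'), (2, 'customer-1'), (4, 'customer-2')]
```

SIMSCRIPT made this pattern (entities, attributes, sets and a future-event list) the core of the language rather than something you build by hand.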

An early research computer: Early computers were built as research projects at universities. You can recognize them by their names ending in "AC" for automatic computer. RAND's JOHNNIAC (named in honor of mathematician and computer architect John von Neumann) was a stored-program computer. It was used for applications including the early artificial intelligence research and operations research mentioned above.


Timesharing: Terminals and drum storage were added to the JOHNNIAC, enabling Cliff Shaw to create JOSS (JOHNNIAC Open Shop System), one of the first interactive time sharing systems. JOSS was "open shop" in that users interacted directly with the computer rather than dropping off jobs to be run at a later time by a computer operator. I was fortunate to see demonstrations of both JOSS and QUICKTRAN, an interactive FORTRAN interpreter built by John Morrisey of IBM. The advantage of these systems over batch processing was immediately and strikingly apparent. This led me to SDC (mentioned above), which by then had a more advanced time sharing system that I used for my dissertation research on man-machine data analysis.

Programmer at a JOSS terminal
The RAND tablet: The RAND Tablet, the great grandfather of the iPad, was built by Tom Ellis and his colleagues. Their GRAIL (graphical input language) software featured object-oriented drawing and character recognition. Their publications are some of the earliest work on human-computer interaction -- GRAIL was the great grandfather of Macdraw. (The following video clip is narrated by Alan Kay of Dynabook fame).

Object-oriented drawing and character recognition on the RAND tablet

Large, packet-switching networks: I've saved the best for last. Paul Baran published a series of eleven reports on Distributed Communications Networks in 1964. In Volume 2, he described the network architecture:

Traffic to be transmitted is first chopped into small blocks, called Message Blocks or simply messages. These messages are then relayed from station to station through the network with each station acting as a small "post office" connected to adjacent "post offices."
After simulating this system and considering the technology of the day, Baran concluded in Volume 11 that
It appears theoretically possible to build large networks able to withstand heavy damage whether caused by unreliability of components or by enemy attack.
He was right!
Paul Baran's distributed network architecture
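Baran's block-relay idea is easy to sketch. The toy Python code below (my illustration, not Baran's design) chops a message into small blocks and routes each one station to station, finding an alternate path when a station has been knocked out:

```python
from collections import deque

# A small distributed mesh: each station lists its neighbors.
network = {
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"],
}

def route(src, dst, failed=frozenset()):
    """Breadth-first search for a path that avoids failed stations."""
    paths = deque([[src]])
    while paths:
        path = paths.popleft()
        if path[-1] == dst:
            return path
        for nxt in network[path[-1]]:
            if nxt not in path and nxt not in failed:
                paths.append(path + [nxt])
    return None   # no surviving path

# "Traffic to be transmitted is first chopped into small blocks..."
message = "FIRST CHOPPED INTO SMALL BLOCKS"
blocks = [message[i:i+8] for i in range(0, len(message), 8)]

# Station B has been knocked out; every block still gets through via C.
received = [b for b in blocks if route("A", "D", failed={"B"})]
print("".join(received) == message)   # True
print(route("A", "D", failed={"B"}))  # ['A', 'C', 'D']
```

The survivability Baran claimed falls out of the redundancy: as long as any path of working stations remains, the blocks get through and the message can be reassembled.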

In the following video, Baran reviews RAND's work on distributed networks and packet switching -- from early concern over the possibility of a nuclear attack disrupting military communications through skepticism about packet switching and the creation of the ARPANet. (Since the ARPANet was just a research project, they did not need to bother with security and encryption).

Paul Baran on distributed networks and packet switching (38 minutes plus Q&A)

Two of the people cited here went on to win Nobel prizes. Harry Markowitz received the Nobel Prize in economics for his work on portfolio theory -- perhaps not tied to his work on digital simulation.

Herbert Simon also received the Nobel Prize in economics for his work on decision making. He noted that we do not make optimal decisions when choosing among alternatives because information about outcomes is incomplete, gathering more information has a cost and outcomes are multidimensional. In real life we make satisfactory decisions -- what Simon called "satisficing." This realization no doubt guided his studies of the thought processes of chess players and theorem provers and therefore his work on artificial intelligence.

(A personal note: I took a class from Professor Simon as an undergrad. All I recall is that I liked him a lot and that he told us about the chess game his "computer" -- whatever that was -- was playing with a computer in Arizona. I also met him much later, and he was modest and helpful -- he told me he stored most of what he knew in his friends' heads).

You can learn more about any of this work on Wikipedia or using Google, but -- better yet -- download the historic reports by these researchers from the RAND Web site.

Update 11/29/2014

As noted above, RAND spun off its development work in 1957 when SDC was set up to build the SAGE (Semi-Automatic Ground Environment) air defense system, designed to defend the U.S. against nuclear attack.

SAGE was the first computer network and the project trained most of the computer programmers in the US at the time. The project also produced many innovations in programming and programming project management.

After SAGE, SDC built an advanced general purpose time-sharing and software development system on an AN/FSQ-32 (Army Navy Fixed Special eQuipment) computer built by IBM. The Q-32 was used for ARPA-sponsored research projects in man-machine interaction -- including my dissertation project. More on SDC in a forthcoming post.

The AN/FSQ-32 supported research on man-machine systems.

Monday, November 17, 2014

18F is doing e-government and gaining traction

18F: Open source and transparent processes -- who says government has to be old fashioned, slow and inefficient?

In an earlier post, I described USDS and 18F, new government agencies that are intended to improve US e-government in the wake of the HealthCare.Gov debacle. USDS is a management consulting firm for federal agencies that favors lean startup methods, open source and agile development by small teams. 18F complements USDS -- they build tools and implement government systems.

You can check 18F's open source projects at the "alpha" version of their project dashboard. As shown here, they currently have twelve projects in various stages of development.

Scrolling down, one sees the entries for each of the 12 current projects. For example, they are building a portal for submitting and searching Freedom of Information Act requests for the Justice Department. (Note that the department is a partner, not a client).

The project descriptions have links to pages where you can see and contribute to the code, discuss the project with the developers and the public, and read a news release describing the project.

18F is not unique. The UK Government Digital Service has the goal of "transforming government services to make them more efficient and effective for users." They were formed several years ago in response to dissatisfaction with the British Health System Web site. You can learn more in this NPR story.

18F and the UK Government Digital Service have something very important in common -- they are staffed by skilled experts who could be making more money in the private sector but have elected (perhaps temporary) government service. I saw the same thing in a study of the Internet in Singapore, where the "best and the brightest" went into government service.

How great would it be if all of government were staffed by the same sort of people?

Update 4/26/2015

The Defense Department and Homeland Security hope to attract tech talent from Silicon Valley and elsewhere to help with security and other applications. This initiative "stems directly from the President," who has turned to modern technology and methods in political campaigns, in debugging the HealthCare.gov site when it was in trouble and more. The pitch is to "take your skills and work for team America."

Thursday, November 13, 2014

Elon Musk and Greg Wyler's plans for global satellite connectivity

Routers in space

(See the related post on cost savings from re-use of rockets used in launching satellites).

In the early 1990s, cellular pioneer Craig McCaw, Microsoft co-founder Bill Gates and Saudi Prince Alwaleed bin Talal founded Teledesic, with the intention of providing global Internet connectivity using low-earth orbit satellites. The satellite and launch technology were not good enough and the company failed.

Teledesic animation showing a satellite
constellation that would cover the planet.

But satellite and launch technology have come a long way since that time. In an earlier post, I asked whether Google could connect the "other three billion" in developing nations and rural areas. The post surveyed Google projects involving high-altitude platforms like blimps, drones or balloons that hover or circulate in the stratosphere, low-earth orbit satellites used for imaging and telephony, and medium-earth orbit satellites used for communications and navigation.

One of those projects was a collaboration with O3b (other three billion), a company founded by Greg Wyler. O3b began with four satellites in 8,000-kilometer equatorial orbits and planned to serve all parts of the Earth within 45 degrees of the Equator. Wyler left O3b when he went to Google, and O3b now has 12 satellites in orbit.

In describing the O3b project, I wondered "whether they are considering a low-earth orbit constellation," and it seems they were. Mr. Wyler subsequently left Google to found WorldVu, which planned a constellation of 300 satellites at altitudes between 800 and 950 kilometers and has acquired Ku-band spectrum. Service will be marketed under the OneWeb brand.

That takes care of the improved satellite technology, but how about launch technology?

The Wall Street Journal recently published an article saying that Mr. Wyler would be teaming up with Elon Musk, founder of SpaceX, to provide global Internet access using a constellation of 700 satellites, each weighing less than 250 pounds. Musk confirmed the plan in a couple of Twitter posts, but also criticized the Wall Street Journal's reporting.

I hope they will be able to realize Teledesic's 1990 vision using 2020 technology.

I concluded my earlier post on this topic by "wondering whether Jeff Bezos, founder of Blue Origin, Elon Musk, founder of SpaceX, and Richard Branson, founder of Virgin Galactic are eyeing those other three billion people." I still wonder about Bezos and Branson.

Update 11/13/2014

After writing this post, I attended a session at the RAND Corporation's Politics Aside conference and had a chance to ask Simonetta Di Pippo, the Director of the UN Office for Outer Space Affairs, for her take on this proposal. She did not give a direct answer, but said that Elon Musk is a very smart man and has never failed at anything he committed to do.

A SpaceX executive overheard my question and said he could not comment, but he reiterated Elon Musk's tweeted statement that the Wall Street Journal article had errors and we would have to wait a couple of months for the full announcement of their plans.

I guess we will have to wait to see, but this could be a Big Deal.

Update 12/15/2014

SpaceX will carry micro-satellites made by Planet Labs to the International Space Station (ISS) for launch into orbit.

Two Planet Labs satellites just after launch from the ISS

The satellites shown being launched are Planet Labs earth-imaging satellites. They are smaller and orbit at a lower altitude than those discussed above, but might a constellation of more, smaller satellites in lower orbits, carrying routers rather than cameras, be suitable for Internet communication? (That is not a rhetorical question -- I do not know).

For more on Planet Labs, check this terrific Ted Talk by Planet Labs co-founder Will Marshall:

Update 1/14/2015

The competition heats up.

Greg Wyler's OneWeb satellite-Internet company has received funding from the Virgin Group and Qualcomm, and Richard Branson of Virgin Group and Paul Jacobs of Qualcomm will have seats on the board. (If you are a Wall Street Journal subscriber, there is a longer article here).

Elon Musk also announced the opening of a Seattle office for the design of satellites. SpaceX declined to comment on Wyler's announcement.

Whoever builds the rockets, satellites and markets the service, it sounds like Teledesic is being reborn using modern technology and, if successful, it would be a major extension of the nervous system of the Earth and a significant enabler of Bill Gates' work in developing nations.

Greg Wyler wants to bring the Internet to the entire world.

Update 1/17/2015

Elon Musk announced that he plans to deploy a constellation of router-equipped satellites -- evidently in competition with Greg Wyler's OneWeb project. Musk announced his plan at a closed meeting for potential employees of his new satellite office and for state, local and federal government officials.

No details were released, and those that leaked appear to be inconsistent. For example, the Seattle Times reported a plan for 4,000 geosynchronous satellites, while Business Week reported 700 low-earth orbit satellites.

This effort is not an end in itself, but part of a larger plan to reach Mars -- Musk says he wants to die on Mars.

It is terrific to see two powerful groups competing to fulfill Bill Gates' original vision of global satellite connectivity -- Teledesic. Teledesic failed, but with modern launch capability, micro-satellites and communication equipment one or both of these efforts may very well succeed. If they do, it will be an historic achievement and a significant complement to Gates' current work in developing nations.

Elon Musk will compete with OneWeb

Update 1/19/2015

When this thread began last November, it seemed like Elon Musk and Greg Wyler would collaborate on an Internet satellite venture, but now it looks more like competition.

A post on Ars Technica quotes Musk as saying “Greg and I have a fundamental disagreement about the architecture -- we want a satellite that is an order of magnitude more sophisticated than what Greg wants. I think there should be two competing systems.”

They quote Richard Branson as saying that Musk doesn't have a chance because Wyler has spectrum rights and there is not enough space for two satellite constellations. Branson thinks the logical thing would be for Musk to work with Wyler rather than compete.

Wyler's former employer Google, which has also been working on satellite connectivity, is said to be close to investing in Musk's SpaceX.

I can't wait to see where all this ends up in, say, five years.

Update 1/20/2015

SpaceX has confirmed an investment of $1 billion from Google and Fidelity for a reported 10% of the company. That implies a valuation of around $10 billion. (I may be old-fashioned, but I don't understand markets that value WhatsApp at nearly double the value of SpaceX).

It also seems that SpaceX is considering the use of modulated laser beams to cope with OneWeb's advantage in spectrum holdings.

Regardless of the technology, OneWeb and SpaceX will have to deal with regulators in each nation they serve, which seems inefficient -- would it make more sense to establish some international regulatory rules?

Looking forward -- what if one of these companies pulls their plans off and ends up serving a billion or two billion customers -- should we worry about their power? It sounds like Comcast on steroids. Even if they both succeed and establish a duopoly, they will have immense power.

Update 1/26/2015

Business Week has published a background piece on Greg Wyler -- his biography and personality. It is interesting reading for general background, and it has a few details that were new to me. He says he plans to orbit 648 satellites at an altitude of 750 miles and hopes to sell the user terminals for around $200. Since there will be several satellites within range of any point on Earth, he says their antennas will not need to be professionally installed or to move to track satellites, as is the case with O3b.

The article is accompanied by a 4:33 video in which Wyler describes O3B and his plans for OneWeb -- here are a couple of stills from the video:

Wyler with a mock-up of a user terminal

Wyler illustrates the latency differences between
low, middle and geostationary orbits
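Wyler's latency comparison is simple physics: radio waves cover roughly 300 kilometers per millisecond. The back-of-envelope Python calculation below (my numbers; it assumes the satellite is directly overhead and ignores processing and queuing delays, so these are lower bounds) gives round-trip propagation times for the three orbit classes:

```python
# Round-trip propagation delay to a satellite directly overhead.
C_KM_PER_S = 299_792.458   # speed of light

orbits_km = {
    "LEO (OneWeb-class, ~1,200 km)": 1_200,
    "MEO (O3b, ~8,000 km)": 8_000,
    "GEO (35,786 km)": 35_786,
}

def round_trip_ms(altitude_km):
    # up and back down, converted to milliseconds
    return 2 * altitude_km / C_KM_PER_S * 1000

for name, alt in orbits_km.items():
    print(f"{name}: ~{round_trip_ms(alt):.0f} ms")
# LEO ~8 ms, MEO ~53 ms, GEO ~239 ms
```

That factor of roughly 30 between low and geostationary orbits is why low-earth constellations are interesting for interactive Internet use, not just broadcast.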

Update 1/26/2015

Cell-phone video (25:53 min) of Elon Musk's talk at the closed-door announcement of the establishment of a satellite design office in Seattle. Many high-level details on the project.

Update 2/12/2015

Third time is closer, but still no cigar :-(. "Rocket soft landed in the ocean within 10m of target & nicely vertical! High probability of good droneship landing in non-stormy weather." — Elon Musk February 11, 2015

Update 3/10/2015

Excerpts from a Via Satellite interview of Greg Wyler.

VIA SATELLITE: With industry verticals being served, and O3b connecting the other 3 billion people, where does OneWeb fit in to the communications landscape?

Wyler: O3b Networks does links around 150 Mbps and up, and this is about links that are much lower speeds than that. Our primary core competency will be sub 50 Mbps to small, inexpensive terminals.

VIA SATELLITE: How difficult was it to get investors like Virgin and QualComm to buy into this vision?

Wyler: Qualcomm knows more about communications chips, handover protocols and LTE than any other company. They also have a long background in satellite having built Globalstar and many other satellite communications systems. Virgin has Richard as the leader with a strong understanding of things that you can’t imagine he would have a sense of, and then this deep bench of players.

VIA SATELLITE: I understand an RFP is already out regarding the manufacturer of these satellites? When do you hope to finalize this?

Wyler: We are building satellites at high volume. They need to be done on a production line, rather than a one-off manufacturing process. We are going into a partnership where we will own a portion of the factory and the manufacturer the other portion.

VIA SATELLITE: Is 2017 a realistic timeframe to launch the first satellites?

Wyler: I am an optimist. I think 2017 is a realistic time to have our test satellites up. I am not saying the constellation will be working then.

Update 3/18/2015

A third would-be satellite ISP, LeoSat, has revealed plans for a constellation of Internet satellites. They will not be marketing to individual end users, but will target government and business -- maritime applications, oil and gas exploration and production, telecom back-haul and trunking, enterprise VSAT, etc. They (and the others) hope to be able to provide low-latency links over long distances. As shown here, a route from Los Angeles to southern Chile requires only 5 satellite hops as opposed to 14 terrestrial hops.

Update 10/7/2016

We have followed SpaceX's efforts to cut satellite launch cost by soft-landing and reusing rockets. Another way to cut launch costs is to use a single launch to place multiple satellites in different orbits and the Indian Space Research Organisation (ISRO) has successfully launched eight satellites into two different orbits. If rocket reuse and multiple-orbit launches become routine, the cost of creating constellations of Internet service satellites will be significantly reduced.

Single launch places eight satellites
in two different orbits.(credit: ISRO)

Update 11/17/2016

SpaceX has submitted a 102-page technical supplement to their application for permission to launch a constellation of Internet-service satellites.

Leaked SpaceX financial documents reveal that Elon Musk expects significant revenue as a satellite Internet service provider:
SpaceX eventually plans to launch 4,000 communications satellites, which would be dozens of times larger than any other constellation, with the first phase of this possibly going online as early as 2018. SpaceX anticipates that the satellite business will become more profitable than the rocket business by 2020, generating tens of billions of dollars by the mid-2020s.
Another post on the same leak was more specific, saying that "SpaceX expects to generate more than $15 billion in profit by 2025."

Update 1/18/2017

Two of the updates to this post were triggered by SpaceX filing a request to launch 4,400 Internet service satellites last November and a leak of SpaceX financial data last week. Each of those events triggered in-depth, informative discussions on Reddit. The Reddit discussion of the request to launch the satellites (here) and the Reddit discussion of SpaceX finance (here) cover launch, radio and IP technology, markets, advantages and disadvantages compared to terrestrial networks and much more. Check them out and join the discussions.

Update 3/11/2017

A 2016 patent by Mark Krebs, then at Google, now at SpaceX, has several interesting figures like this one specifying two constellations, each at a different altitude. As shown here, the lower-altitude satellites have smaller footprints, but would have lower latency times than the higher altitude satellites.

The higher-orbit satellites will be launched in the first phase of the project, enabling SpaceX to bring the Internet to underserved and rural areas of the Earth. The second-phase, lower-orbit satellites will be able to offer faster service and possibly compete with terrestrial networks in urban areas. I am not sure, but being at different altitudes might also simplify multi-satellite launches -- launch half at the lower altitude, then proceed to the higher altitude and launch the second half. The high-altitude satellites might also enable fewer hops on long, over-the-horizon routes (as shown above) and make for smoother satellite handoffs during a session. The following is a neat video explainer of the SpaceX plan:

Wednesday, November 12, 2014

Roku has made my Google Chromecast superfluous

Do you still need a Chromecast device?

I am a cord-cutter -- I use a Roku streaming device on my TV set and do not receive regular cable channels. (I have a trusty rabbit ears antenna for local over-the-air broadcasts, but that will not work for many people).

I also have a Google Chromecast, which I said I loved in a review a little over a year ago. But, a year later, it turns out there is nothing in the Chromecast app library that I want to see that is not also available on my Roku. In fact, I watch several Roku channels -- like PBS -- that are not currently available for the Chromecast. I guess Bill Clinton would say "it's the content, stupid."

But, I kept my Chromecast around for screencasting -- mirroring my computer or phone on the TV set -- until now.

It is now superfluous because Roku has released the beta version of Miracast screen mirroring for Windows 8.1 and selected Android devices.

As you see here, when I open the Screencast setting on my Android phone, I have two target destinations -- the Chromecast and the Roku streaming stick. (Both are connected to the same TV set).

I did an informal test of the two devices using the CBS All Access video streaming service. I watched episodes of "Big Bang Theory" on both devices and did not notice a significant difference in quality. The video had occasional half-second stutters, and the audio would drift out of synch from time to time, but the program was watchable on both the Roku and the Chromecast. I have no doubt that next-generation hardware and improved video and compression algorithms will take care of those small glitches (as long as my ISP keeps the bits flowing smoothly).

Miracasting is only available on two Roku models -- the Roku 3 and the Streaming Stick -- and selected Windows 8.1 and Android devices, but no doubt wider support is coming. If you have a miracast-compatible device, you might as well unplug your Chromecast.

Update 11/20/2014

A couple days ago, I got the Android Lollipop update for my Nexus 5 phone and CBS updated their All Access application, so I decided to retest the streaming video quality. I used this as an excuse for watching another episode of Big Bang Theory and (subjectively) noted that the video had smoothed out -- there were no stutters -- but the audio had deteriorated -- it was out of synch the entire time. I don't know whether this is attributable to Lollipop or the app or a combination of the two.

I ran this test twice -- once streaming to a Roku Streaming Stick and the other to a Chromecast -- and the subjective experience was the same.

I'm disappointed by this step in the wrong direction, but I expect faster hardware and better algorithms to smooth out the video glitches and synch the audio.

One other cosmetic change -- the Lollipop screencast screen has changed to black on white:

Monday, November 10, 2014

Google testing high-speed wireless -- the last kilometer for Google Fiber?

Google could even take the Android approach -- make the technology available to municipal governments and others and watch their advertising business grow as it is deployed.

In 2012, Goldman Sachs analyst Jason Armstrong looked at Google Fiber and estimated that it would cost $70 billion to connect less than half of all US homes. He also estimated that it had cost Verizon $15 billion to bring FiOS fiber to 17 million homes. Armstrong concluded that he was "still bullish on cable, although not blind to the risks." (Armstrong has since left Goldman Sachs for Comcast, and Verizon has cut back on FiOS).

That sounds grim, but what if wireless technology could significantly reduce the cost of connecting homes and offices?

Google has asked the FCC for permission to conduct tests of millimeter wave-length wireless communication for 180 days.

As shown below, short-wavelength, high-frequency (E-band) signals travel relatively short distances and cannot pass through walls or other obstructions, but they enable gigabit and faster data transmission rates:

E-band wireless in context: The current market is dominated by a few companies selling equipment for cell phone backhaul and other point-to-point applications, but what if the smart guys at Google could figure out a way to use it for neighborhood links? (Image: E-band communications.)
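The appeal of E-band is raw bandwidth. Shannon's formula, capacity = bandwidth × log2(1 + SNR), shows why: the 71-76 and 81-86 GHz allocations allow channels of several gigahertz, versus tens of megahertz for a typical microwave link. The Python sketch below is illustrative only -- the 5 GHz channel and 15 dB signal-to-noise ratio are assumed figures, not Google's:

```python
import math

# Shannon capacity: C = B * log2(1 + SNR), here reported in Gbps.
def shannon_capacity_gbps(bandwidth_hz, snr_db):
    snr = 10 ** (snr_db / 10)          # convert dB to a power ratio
    return bandwidth_hz * math.log2(1 + snr) / 1e9

# A hypothetical 5 GHz E-band channel vs. a 20 MHz microwave channel,
# both at an assumed 15 dB SNR.
print(f"{shannon_capacity_gbps(5e9, 15):.1f} Gbps")    # E-band channel
print(f"{shannon_capacity_gbps(20e6, 15):.2f} Gbps")   # 20 MHz channel
```

Real links fall well short of the Shannon ceiling, but the 250-fold bandwidth advantage is what makes gigabit neighborhood links plausible at these frequencies.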

How much of Armstrong's $70 billion estimate would Google (or anyone else) save if they could run fiber to the block or neighborhood and reach individual homes using this radio technology?

Google Fiber started in Kansas City and today is available in two other cities (and some surrounding areas). They are currently evaluating 34 additional cities, and those cities would look a lot more attractive if Google were able to use wireless links to reach homes from neighborhood poles. Google Fiber could also provide backhaul for mobile communication.

If this dream materialized, Google would provide stiff competition to the incumbent phone and cable companies and drive connectivity prices down, but would that be the best solution for the public?

In the US, most of us have only one or perhaps two competing Internet service providers. Google would be a second or third, but we would still have an oligopoly and, while Google may not "do evil" today, who knows about the future?

Google, Comcast or any other ISP must deal with local government for things like access to tunnels, phone poles and utility boxes. Might we not be better off in the long run if local government owned the infrastructure regardless of the technology? This solution has worked well in Stockholm, Sweden, where the municipality owns the infrastructure and sells wholesale access to ISPs who serve customers.

What will Google do if this technology works out? They could become nationwide wholesale or retail ISPs or even take the Android approach -- make the technology available to municipal governments and others and watch their advertising business grow as it is deployed.

All of this is highly speculative, but if the technology and business model work out, we may be able to get low-cost gigabit connectivity without moving to Kansas City.

Update 4/16/2016

The FCC has granted Google an experimental license for terrestrial and airborne high-frequency wireless tests. The grant is effective March 17, 2016, through April 1, 2018, and covers 71-76 GHz and 81-86 GHz frequencies.

The airborne experimentation may be for Google's Project Loon, and the terrestrial experimentation may be for high-speed, short-range wireless links in densely populated neighborhoods (like the street where I live :-).

Saturday, November 08, 2014

Harvard study of variance in lecture attendance

Attendance varies between courses, with the day of the week and with special events like exams and guest speakers.

Samuel Moulton, director of educational research and assessment for the Harvard Initiative for Learning and Teaching, gave a presentation on their preliminary research on lecture attendance. He reported on attendance in 10 classes and found that attendance varied depending upon the day of the week:

Attendance soared on exam days and dropped the Friday before Spring break:

(Attendance can be over 100% since students drop classes after enrolling and attending the first few lectures).

He also saw that special events like an optional movie or a guest speaker had an impact on attendance:

Moulton further analyzed the data by dropping events like exam days and fitting a line to the data:

He concluded by showing the plots for all of the courses in his sample:

As you see, there is significant variation in attendance.
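Fitting a trend line like Moulton's is ordinary least squares, which needs nothing more than sums. The Python sketch below uses made-up attendance figures (his data is not reproduced here) to show the calculation:

```python
# Percent attendance over ten lecture meetings -- illustrative data only.
lectures   = list(range(1, 11))
attendance = [95, 91, 88, 90, 84, 85, 80, 78, 76, 73]

n = len(lectures)
mean_x = sum(lectures) / n
mean_y = sum(attendance) / n

# Ordinary least squares: slope = cov(x, y) / var(x).
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(lectures, attendance)) \
        / sum((x - mean_x) ** 2 for x in lectures)
intercept = mean_y - slope * mean_x

print(f"attendance = {intercept:.1f} {slope:+.2f} * lecture")
# attendance = 96.7 -2.32 * lecture
```

A slope of about -2 points per lecture on made-up data is just an illustration, but it is the same straight-line fit shown in Moulton's plots.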

How do these results compare to your experience? What factors contribute to attendance variation?

Watch Moulton's presentation: