TIME magazine called him
“the unsung hero behind the Internet.” CNN called him “A Father of the Internet.”
President Bill Clinton called him “one of the great minds of the Information
Age.” He has been voted history’s greatest scientist
of African descent. He is Philip Emeagwali.
He is coming to Trinidad and Tobago to launch the 2008 Kwame Ture lecture series
on Sunday June 8 at the JFK [John F. Kennedy] auditorium
UWI [The University of the West Indies] Saint Augustine 5 p.m.
The Emancipation Support Committee invites you to come and hear this inspirational
mind address the theme:
“Crossing New Frontiers to Conquer Today’s Challenges.”
This lecture is one you cannot afford to miss. Admission is free.
So be there on Sunday June 8 at 5 p.m.
at the JFK auditorium, UWI St. Augustine. [Wild applause and cheering for 22 seconds] The June 14, 1976 issue
of Computerworld magazine reported on a special session
on parallel processing that occurred at the
National Computer Conference. At that conference session,
respected leaders of thought in supercomputing
mocked parallel processing research as a [quote unquote] “waste of time.”
That four-day National Computer Conference was held in New York City,
from June 7 to 10, 1976. Computerworld reported that
a panel of supercomputer experts at that National Computer Conference
was of the opinion that [I quote]:
“Those machines often turn out to be large and clumsy,
and several of the large parallel processor designs
since then have failed. Now we are moving into the modern era.”
[End of quote] Supercomputer scientists were reading articles,
such as the one in the June 14, 1976 issue of Computerworld magazine
that was titled “Research in Parallel Processing
Questioned as ‘Waste of Time.’” In the 1970s and ‘80s,
Computerworld was as eagerly read
and as authoritative as Ebony magazine was
in the African-American community. The reason parallel processing
was rejected and mocked was that, in the 1970s,
punched-card programming of a general circulation model
was by itself a grand challenge. Back then,
it was easier to travel to the moon than to program a
massively parallel processing supercomputer. My two central questions were these:
First, how do I email instructions to each of my 65,536
commodity processors, or to as many identical computers
that together define a new internet,
and how do I command those processors to execute an ensemble of
65,536 general circulation models
with rigorous reproducibility requirements,
to solve as many initial-boundary value problems
with a one-to-one correspondence between the processors
and the models, and to solve those problems
simultaneously, or in parallel?
And, second, how do I instruct those 65,536 processors
to email the numerical answers
to those initial-boundary value problems synchronously,
or in parallel? Asking such parallel processing questions
seemed ludicrous to the 25,000 supercomputer scientists
of the 1970s and ‘80s. In those two decades,
parallel processing was an unproven technology.
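The two questions above describe what is now a familiar scatter-compute-gather pattern: pair each problem with exactly one processor, solve all the problems at once, and collect the answers synchronously. Here is a minimal sketch of that pattern in modern Python rather than the languages of the era, scaled down from 65,536 processors to a handful of worker processes; the `solve_model` function is a hypothetical stand-in for a general circulation model, not Emeagwali's actual code.

```python
# Sketch of the scatter-compute-gather pattern described in the lecture:
# one problem per worker (one-to-one correspondence), all solved in
# parallel, answers gathered back synchronously.
from multiprocessing import Pool

def solve_model(initial_condition):
    """Hypothetical stand-in for one initial-boundary value problem;
    here, just a trivial computation on the initial condition."""
    return initial_condition * 2.0

if __name__ == "__main__":
    n_workers = 8  # stands in for the 65,536 processors
    initial_conditions = [float(i) for i in range(n_workers)]
    with Pool(n_workers) as pool:
        # map() enforces the one-to-one pairing of problems to workers
        # and blocks until every answer has been returned.
        answers = pool.map(solve_model, initial_conditions)
    print(answers)
```

The essential point the pattern illustrates is the one-to-one correspondence: the i-th answer in the gathered list comes from the i-th problem, so no bookkeeping is lost by computing in parallel.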
Massively parallel processing demanded my intimate, explicit,
and exact knowledge of the position of each of my
64 binary thousand processors.
In totality, the topological positions of my processors
outlined and defined the specific massively parallel
supercomputer that I programmed.
The configuration that had 65,536
processors made the news headlines in 1989.
For me, Philip Emeagwali, a lone wolf
massively parallel supercomputer scientist, communicating and computing
across a new internet
demanded that I know the one binary million zeroes
and ones that re-defined that internet
as a hopeful new supercomputer. I defined that new supercomputer
not as a massively parallel processing machine per se
but as a new internet, de facto. I visualized my new internet
as a global network of 64 binary thousand
commodity processors that were identical
and that were equal distances apart, or as a global network of
as many identical computers: one cohesive supercomputer.
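The "binary" figures in this passage check out arithmetically, assuming (my reading, not stated explicitly in the lecture) that the "one binary million zeroes and ones" refers to the complete list of processor addresses: 65,536 is 2 to the 16th power ("64 binary thousand", since 64 × 1,024 = 65,536), so each processor carries a 16-bit binary address, and writing out all 65,536 addresses takes 2^20 = 1,048,576 bits. A quick sketch:

```python
# Checking the lecture's figures: 65,536 processors = 2**16, so each
# processor is named by a 16-bit binary address, and listing every
# address takes 2**16 * 16 = 2**20 bits: "one binary million".
n_processors = 2 ** 16          # 65,536, i.e. "64 binary thousand"
address_bits = 16               # bits needed to name one processor
total_bits = n_processors * address_bits

assert n_processors == 65_536
assert n_processors == 64 * 1024           # "64 binary thousand"
assert total_bits == 2 ** 20 == 1_048_576  # "one binary million"

# For example, processor number 42's 16-bit address:
print(format(42, "016b"))  # → 0000000000101010
```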
The modern supercomputer existed in the terra incognita
of that internet technology: the unknown world
of supercomputing of the 1970s and ‘80s. [Wild applause and cheering for 17 seconds] Insightful and brilliant lecture