THE WEB, A MULTILINGUAL ENCYCLOPEDIA
MARIE LEBERT, 2012
TABLE OF CONTENTS
1974 > The internet “took off”
1990 > The invention of the web
1990 > The LINGUIST List
1991 > From ASCII to Unicode
1994 > Travlang, travel and languages
1995 > The Internet Dictionary Project
1995 > NetGlos, a glossary of the internet
1995 > Various languages on our screen
1995 > Global Reach, promoting localization
1996 > OneLook Dictionaries, a “fast finder”
1997 > 82.3% of the web in English
1997 > The internet, a tool for minority languages
1997 > A European terminology database
1997 > Babel Fish, free translation software
1997 > The tools of the translation company Logos
1997 > Specialized terminology databases
1998 > The need for a “linguistic democracy”
1999 > Bilingual dictionaries in WordReference.com
1999 > The internet, a mandatory tool for translators
1999 > The need for bilingual information online
2000 > Online encyclopedias and dictionaries
2000 > The web portal yourDictionary.com
2000 > Project Gutenberg and languages
2001 > Wikipedia, a collaborative encyclopedia
2001 > UNL, a digital metalanguage project
2001 > A market for language translation software
2004 > The web 2.0, community and sharing
2007 > The ISO 639-3 standard to identify languages
2007 > Google Translate
2009 > 6,909 languages in the Ethnologue
2010 > A UNESCO atlas for endangered languages
INTRODUCTION
"The web will be an encyclopedia of the world by the world for the world. There will be no information or knowledge that anyone needs that will not be available. The major hindrance to international and interpersonal understanding, personal and institutional enhancement, will be removed. It would take a wilder imagination than mine to predict the effect of this development on the nature of humankind." (Robert Beard, founder of A Web of Online Dictionaries, September 1998)
This book is a chronology in 31 chapters from 1974 to 2010. Many thanks to all those who are quoted here, for their time and their friendship. Unless specified otherwise, the quotes are excerpts from the interviews conducted by the author during several years and published in the same collection.
1974 > THE INTERNET "TOOK OFF"
[Summary] The internet “took off” in 1974 with the creation of TCP/IP (Transmission Control Protocol / Internet Protocol) by Vinton Cerf and Bob Kahn, fifteen years before the invention of the web. The internet expanded as a network linking U.S. governmental agencies, universities and research centers, before spreading worldwide in 1983. The internet got its first boost in 1990 with the invention of the web by Tim Berners-Lee, and its second boost in 1993 with the release of Mosaic, the first browser for the general public. The Internet Society (ISOC) was founded in 1992 by Vinton Cerf to promote the development of the internet as a medium that was becoming part of our lives. There were 100 million internet users in December 1997, with one million new users per month, and 300 million users in December 2000.
***
The internet “took off” in 1974 with the creation of TCP/IP (Transmission Control Protocol / Internet Protocol) by Vinton Cerf and Bob Kahn, fifteen years before the invention of the web.
# A new medium
The internet expanded as a network linking U.S. governmental agencies, universities and research centers, before spreading worldwide in 1983.
The internet got its first boost in 1990 with the invention of the web by Tim Berners-Lee, and its second boost in 1993 with the release of Mosaic, the first browser for the general public.
Vinton Cerf founded the Internet Society (ISOC) in 1992 to promote the development of the internet as a medium that was becoming part of our lives. When interviewed by the French daily Libération on 16 January 1998, he explained that the network was doing two things. Like books, it could accumulate knowledge. But, more importantly, it presented knowledge in a way that connected it with other information whereas, in a book, information stayed isolated.
Because the web was easy to use with hyperlinks going from one document to the next, the internet could now be used by anyone, and not only by computer literate users. There were 100 million internet users in December 1997, with one million new users per month, and 300 million users in December 2000.
# A worldwide expansion
North America was leading the way in computer science and communication technology, with significant funding. Computers were cheaper there than in Europe, and so was a connection to the internet.
In some European countries, internet users (including the author of these lines) needed to surf the web at night, when phone rates by the minute were cheaper, to cut their expenses. In late 1998 and early 1999, some users in France, Germany and Italy launched a movement to boycott the internet one day per week, as a way to force internet providers and phone companies to set up a special monthly rate. This action paid off, and providers began to offer "internet rates".
In summer 1999, the number of internet users living outside the U.S. reached 50%.
In summer 2000, the number of internet users having a mother tongue other than English also reached 50%, and went on steadily increasing. According to statistics regularly published on the website of Global Reach, a marketing consultancy promoting internationalization and localization, non-English-speaking users accounted for 52.5% of all users in summer 2001, 57% in December 2001, 59.8% in April 2002, 64.4% in September 2003 (including 34.9% non-English-speaking Europeans and 29.4% Asians), and 64.2% in March 2004 (including 37.9% non-English-speaking Europeans and 33% Asians).
Broadband became the norm over the years. Jean-Paul, webmaster of the hypermedia website cotres.net, summarized things in January 2007: “I feel that we are experiencing a ‘floating’ period between the heroic ages, when we were moving forward while waiting for the technology to catch up, and the future, when high-speed broadband will unleash forces that just begin to move, for now only in games.”
# The internet of the future
The internet of the future could be a “pervasive” network allowing us to connect in any place and at any time on any device through a single omnipresent network.
The concept of a “pervasive” network was developed by Rafi Haladjian, founder of the European company Ozone, who explained on its website in 2007 that “the new wave would affect the physical world, our real environment, our daily life in every moment. We will not access the network any more, we will live in it. The future components of this network (wired parts, non wired parts, operators) will be transparent to the final user. The network will always be open, providing a permanent connection anywhere. It will also be agnostic in terms of applications, as a network based on the internet protocols themselves.” We do look forward to this.
As for the content of the internet, Timothy Leary, a visionary writer, described it in 1994 in his book “Chaos & Cyber Culture” as gigantic glass towers containing all the world's information, with free access through cyberspace not only to all books, but also to all pictures, all movies, all TV shows, and all other data. In 2011, we are not there yet, but we are getting there.
1990 > THE INVENTION OF THE WEB
[Summary] The World Wide Web was invented in 1990 by Tim Berners-Lee at CERN (European Organization for Nuclear Research), Geneva, Switzerland. In 1989, Tim Berners-Lee networked documents using hypertext. In 1990, he developed the first HTTP (HyperText Transfer Protocol) server and the first web browser. In 1991, the web was operational and radically changed the way people were using the internet. Hypertext links allowed us to move from one textual or visual document to another with a simple click of the mouse. Information became interactive, thus more attractive to many users. Later on, this interactivity was further enhanced with hypermedia links that could link texts and images with video and sound. The World Wide Web Consortium (W3C) was founded in October 1994 to develop protocols for the web.
***
The World Wide Web was invented in 1990 by Tim Berners-Lee, a researcher at CERN (European Organization for Nuclear Research), Geneva, Switzerland, who made the internet accessible to all.
# How the web started
In 1989, Tim Berners-Lee networked documents using hypertext. In 1990, he developed the first HTTP (HyperText Transfer Protocol) server and the first web browser. In 1991, the web was operational and made the internet accessible to all. Hypertext links allowed us to move from one textual or visual document to another with a simple click of the mouse. Information became interactive, thus more attractive to many users. Later on, this interactivity was further enhanced with hypermedia links that could link texts and images with video and sound.
Developed by NCSA (National Center for Supercomputing Applications) at the University of Illinois (USA) and distributed free of charge in November 1993, Mosaic was the first browser for the general public, and contributed greatly to the development of the web. In early 1994, part of the Mosaic team migrated to the Netscape Communications Corporation to develop a new browser called Netscape Navigator. In 1995, Microsoft launched its own browser, Internet Explorer. Other browsers followed, like Opera and Apple's Safari.
The World Wide Web Consortium (W3C) was founded in October 1994 to develop interoperable technologies (specifications, guidelines, software, other tools) for the web, for example specifications for markup languages (HTML, XML and others). It also acted as a forum for information, commerce, communication and collective understanding. In 1998, the section Internationalization/Localization gave access to some protocols for creating a multilingual website: HTML, base character set, new tags and attributes, HTTP, language negotiation, URLs and other identifiers including non-ASCII characters, etc.
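The language negotiation mentioned above relies on the HTTP "Accept-Language" header, in which the browser lists the user's preferred languages with quality ("q") values, and the server picks the best match among the languages its pages exist in. As a present-day illustration (the "negotiate" helper below is hypothetical, and a real server would also match regional subtags like "fr-CA" against "fr"), a minimal sketch in Python:

```python
# Pick the supported language with the highest client-side
# quality ("q") value from an Accept-Language header.
def negotiate(accept_language, supported):
    prefs = []
    for part in accept_language.split(","):
        lang, _, q = part.strip().partition(";q=")
        prefs.append((float(q) if q else 1.0, lang.strip()))
    # Highest q value first; an entry without ";q=" counts as 1.0.
    for _, lang in sorted(prefs, reverse=True):
        if lang in supported:
            return lang
    return supported[0]  # fall back to the site's default language

print(negotiate("fr;q=0.9,en;q=0.8", ["en", "fr", "de"]))  # fr
```

A multilingual website of the late 1990s would serve, after this negotiation, the French or German copy of a page under the same URL, which is the mechanism the W3C section documented.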
# Tim Berners-Lee’s dream
Pierre Ruetschi, a journalist for the Swiss daily “Tribune de Genève”, asked Tim Berners-Lee on 20 December 1997: "Seven years later, are you satisfied with the way the web has evolved?". He answered that, if he was pleased with the richness and diversity of information, the web still lacked the power planned in its original design. He would like "the web to be more interactive, and people to be able to create information together", and not only to be information consumers. The web was supposed to become a "medium for collaboration, a world of knowledge that we share."
In an essay posted on his webpage, Tim Berners-Lee wrote in May 1998: "The dream behind the web is of a common information space in which we communicate by sharing information. Its universality is essential: the fact that a hypertext link can point to anything, be it personal, local or global, be it draft or highly polished. There was a second part of the dream, too, dependent on the web being so generally used that it became a realistic mirror (or in fact the primary embodiment) of the ways in which we work and play and socialize. That was that once the state of our interactions was online, we could then use computers to help us analyze it, make sense of what we are doing, where we individually fit in, and how we can better work together." (excerpt from "The World Wide Web: A very short personal history")
# The web 2.0
According to Netcraft, a company tracking data on the internet, the number of websites went from one million (April 1997) to 10 million (February 2000), 20 million (September 2000), 30 million (July 2001), 40 million (April 2003), 50 million (May 2004), 60 million (March 2005), 70 million (August 2005), 80 million (April 2006), 90 million (August 2006) and 100 million (November 2006), with a growing number of personal websites and blogs.
The term “web 2.0” was invented in 2004 by Tim O’Reilly, a publisher of computer books, as a title for a series of conferences he was organizing. The web 2.0 may begin to answer Tim Berners-Lee’s dream as a web based on community and sharing, with many collaborative projects across borders and languages.
Fifteen years after the invention of the web, Wired stated in its August 2005 issue that less than half of the web was commercial, with the other half being run by passion. As for the internet, according to the French daily Le Monde dated 19 August 2005, its three powers — ubiquity, variety and interactivity — made its potential use quasi infinite.
Robert Beard, a language teacher at Bucknell University, Pennsylvania, and the founder of A Web of Online Dictionaries in 1995, wrote as early as September 1998: "The web will be an encyclopedia of the world by the world for the world. There will be no information or knowledge that anyone needs that will not be available. The major hindrance to international and interpersonal understanding, personal and institutional enhancement, will be removed. It would take a wilder imagination than mine to predict the effect of this development on the nature of humankind."
1990 > THE LINGUIST LIST
[Summary] The LINGUIST List was founded by Anthony Rodrigues Aristar in 1990 at the University of Western Australia, with 60 subscribers, before moving to Texas A&M University in 1991, with Eastern Michigan University established as the main editing site for the list. In 1997, emails sent to the distribution list were also available on the list's own website, in several sections: the profession (conferences, linguistic associations, programs), research and research support (papers, dissertation abstracts, projects, bibliographies, topics, texts), publications, pedagogy, language resources (languages, language families, dictionaries, regional information), and computer support (fonts and software). The LINGUIST List is a component of the WWW Virtual Library for linguistics.
***
The LINGUIST List was founded by Anthony Rodrigues Aristar in 1990 at the University of Western Australia, as a mailing list for academic linguists.
The list, which had 60 subscribers, moved to Texas A&M University in 1991, with Eastern Michigan University established as the main editing site for the list.
In 1997, emails sent to the distribution list were also available on the list's own website, in several sections: the profession (conferences, linguistic associations, programs), research and research support (papers, dissertation abstracts, projects, bibliographies, topics, texts), publications, pedagogy, language resources (languages, language families, dictionaries, regional information), and computer support (fonts and software). The LINGUIST List is a component of the WWW Virtual Library for linguistics.
Helen Dry, co-moderator of the LINGUIST List since 1991, wrote in August 1998: "The LINGUIST List, which I moderate, has a policy of posting in any language, since it is a list for linguists. However, we discourage posting the same message in several languages, simply because of the burden extra messages put on our editorial staff. (We are not a bounce-back list, but a moderated one. So each message is organized into an issue with like messages by our student editors before it is posted.) Our experience has been that almost everyone chooses to post in English. But we do link to a translation facility that will present our pages in any of five languages; so a subscriber need not read LINGUIST in English unless s/he wishes to. We also try to have at least one student editor who is genuinely multilingual, so that readers can correspond with us in languages other than English."
She added in July 1999: "We are beginning to collect some primary data. For example, we have searchable databases of dissertation abstracts relevant to linguistics, of information on graduate and undergraduate linguistics programs, and of professional information about individual linguists. The dissertation abstracts collection is, to my knowledge, the only freely available electronic compilation in existence."
1991 > FROM ASCII TO UNICODE
[Summary] Used since the beginning of computing, ASCII (American Standard Code for Information Interchange) is a 7-bit coded character set for information interchange in English. It was published in 1963 by ANSI (American National Standards Institute). With the internet spreading worldwide, to communicate in English (and Latin) was not enough anymore. The accented characters of several European languages and characters of some other languages were taken into account from 1986 onwards with 8-bit variants of ASCII, also called extended ASCII, that provided sets of 256 characters. But problems were not over until the publication of Unicode in January 1991 as a new universal encoding system. Unicode provided "a unique number for every character, no matter what the platform, no matter what the program, no matter what the language", and could handle 65,000 characters or ideograms.
***
With the internet spreading worldwide, the use of ASCII and extended ASCII was not enough anymore, thus the need to take into account all languages with Unicode, whose first version was published in January 1991.
Used since the beginning of computing, ASCII (American Standard Code for Information Interchange) is a 7-bit coded character set for information interchange in English (and Latin). It was published in 1963 by ANSI (American National Standards Institute). The 7-bit plain ASCII, also called Plain Vanilla ASCII, is a set of 128 characters with 95 printable unaccented characters (A-Z, a-z, numbers, punctuation and basic symbols), the ones that are available on the American/English keyboard.
With computer technology spreading outside North America, the accented characters of several European languages and characters of some other languages were taken into account from 1986 onwards with 8-bit variants of ASCII, also called extended ASCII, that provided sets of 256 characters.
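The difference between the two character sets can be illustrated with a short present-day Python sketch (Latin-1 standing in here for one common extended ASCII variant; neither Python nor Unicode existed in the period described):

```python
# Plain 7-bit ASCII: 128 code points, of which 95 are printable
# (the characters available on an American/English keyboard).
printable = [chr(c) for c in range(128) if chr(c).isprintable()]
print(len(printable))  # 95

# Extended ASCII (here the Latin-1 variant) uses the 8th bit for
# 128 extra code points, enough for Western European accents.
print("café".encode("latin-1"))  # b'caf\xe9' -> 'é' is byte 0xE9

# The same string has no representation in plain 7-bit ASCII.
try:
    "café".encode("ascii")
except UnicodeEncodeError as err:
    print("not ASCII:", err.reason)  # "ordinal not in range(128)"
```

The limit of extended ASCII is visible in the sketch: 256 slots can hold one accented repertoire at a time, which is why only one language could be displayed on a page, as Brian King explains below.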
Brian King, director of the WorldWide Language Institute (WWLI), explained in September 1998: “Computer technology has traditionally been the sole domain of a 'techie' elite, fluent in both complex programming languages and in English — the universal language of science and technology. Computers were never designed to handle writing systems that couldn't be translated into ASCII. There wasn't much room for anything other than the 26 letters of the English alphabet in a coding system that originally couldn't even recognize acute accents and umlauts — not to mention non-alphabetic systems like Chinese. But tradition has been turned upside down. Technology has been popularized. (…)
An extension of (local) popularization is the export of information technology around the world. Popularization has now occurred on a global scale and English is no longer necessarily the lingua franca of the user. Perhaps there is no true lingua franca, but only the individual languages of the users. One thing is certain — it is no longer necessary to understand English to use a computer, nor is it necessary to have a degree in computer science. A pull from non-English-speaking computer users and a push from technology companies competing for global markets has made localization a fast growing area in software and hardware development. This development has not been as fast as it could have been. The first step was for ASCII to become extended ASCII. This meant that computers could begin to start recognizing the accents and symbols used in variants of the English alphabet — mostly used by European languages. But only one language could be displayed on a page at a time. (…)
The most recent development [in 1998] is Unicode. Although still evolving and only just being incorporated into the latest software, this new coding system translates each character into 16 bits. Whereas 8-bit extended ASCII could only handle a maximum of 256 characters, Unicode can handle over 65,000 unique characters and therefore potentially accommodate all of the world's writing systems on the computer. So now the tools are more or less in place. They are still not perfect, but at last we can surf the web in Chinese, Japanese, Korean, and numerous other languages that don't use the Western alphabet. As the internet spreads to parts of the world where English is rarely used — such as China, for example, it is natural that Chinese, and not English, will be the preferred choice for interacting with it. For the majority of the users in China, their mother tongue will be the only choice."
First published in January 1991, Unicode "provides a unique number for every character, no matter what the platform, no matter what the program, no matter what the language" (excerpt from the website). This platform-independent encoding provides a basis for the processing, storage and interchange of text data in any language. Unicode is maintained by the Unicode Consortium, with its variants UTF-8, UTF-16 and UTF-32 (UTF: Unicode Transformation Format), and is a component of the specifications of the World Wide Web Consortium (W3C). Unicode has replaced ASCII for text files on Windows platforms since 1998. Unicode surpassed ASCII on the internet in December 2007.
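Unicode's "unique number for every character", and the way the UTF variants serialize that same number into different byte sequences, can be illustrated with a short present-day Python sketch:

```python
# One code point per character, whatever the language or platform.
for ch in "Aé中":
    print(ch, hex(ord(ch)))  # A 0x41, é 0xe9, 中 0x4e2d

# The UTF transformation formats serialize the same code point
# (here U+4E2D, a Chinese ideogram) into different byte sequences.
print("中".encode("utf-8").hex())      # e4b8ad    (3 bytes)
print("中".encode("utf-16-be").hex())  # 4e2d      (2 bytes)
print("中".encode("utf-32-be").hex())  # 00004e2d  (4 bytes)
```

The first 128 code points coincide with plain ASCII, which is one reason UTF-8 in particular spread so easily on the web.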
1994 > TRAVLANG, TRAVEL AND LANGUAGES
[Summary] Travlang was the first website to offer links to free basic translation dictionaries, intended for travelers and the general public. As a first step, Michael C. Martin created a “Foreign Languages for Travelers” section on his university website in 1994, when he was a physics student in New York. One year later, he launched Travlang, a site that quickly became a major portal for travel and languages, and won a best travel site award in 1997. Travlang was still maintained in 1998 by Michael C. Martin, by then a researcher in experimental physics at the Lawrence Berkeley National Laboratory in California. The section “Translating Dictionaries” gave access to free basic online dictionaries in a number of languages (Afrikaans, Czech, Danish, Dutch, Esperanto, Finnish, French, Frisian, German, Hungarian, Italian, Latin, Norwegian, Portuguese, Spanish, Swedish). Other sections offered links to language dictionaries, translation services and language schools.
***
Travlang was the first website to offer links to free basic translation dictionaries, intended for travelers and the general public.
As a first step, Michael C. Martin created a “Foreign Languages for Travelers” section on his university website in 1994, when he was a physics student in New York. One year later, he launched Travlang, a site that quickly became a major portal for travel and languages, and won a best travel site award in 1997.
Travlang was still maintained in 1998 by Michael C. Martin, by then a researcher in experimental physics at the Lawrence Berkeley National Laboratory in California.
The section “Foreign Languages for Travelers” gave links to online tools to learn 60 languages. The section “Translating Dictionaries” gave access to free basic online dictionaries in a number of languages (Afrikaans, Czech, Danish, Dutch, Esperanto, Finnish, French, Frisian, German, Hungarian, Italian, Latin, Norwegian, Portuguese, Spanish, Swedish). Other sections offered links to translation services, language schools and multilingual bookstores. People could also book their hotel, car or plane ticket, look up exchange rates and browse an index of 7,000 links to other language and travel sites.
Michael C. Martin wrote in August 1998: "I think the web is an ideal place to bring different cultures and people together, and that includes being multilingual. Our Travlang site is so popular because of this, and people desire to feel in touch with other parts of the world. (…) The internet is really a great tool for communicating with people you wouldn't have the opportunity to interact with otherwise. I truly enjoy the global collaboration that has made our Foreign Languages for Travelers pages possible."
What about the future? "I think computerized full-text translations will become more common, enabling a lot of basic communications with even more people. This will also help bring the internet more completely to the non-English speaking world."
Michael C. Martin sold Travlang to GourmetMarket.com in February 1999. GourmetMarket.com sold it to iiGroup in January 2000. By July 2000, the site was pulling in two million visitors a month.
1995 > THE INTERNET DICTIONARY PROJECT
[Summary] Tyler Chambers first created the Human-Languages Page (H-LP) in May 1994 as an index of language-related internet resources in a number of languages. In 1995, Tyler launched a second project, the Internet Dictionary Project (IDP), as a collaborative project to create free online dictionaries from English to other languages (French, German, Italian, Latin, Portuguese, Spanish). As explained in 1998 on the project's website: "The Internet Dictionary Project's goal is to create royalty-free translating dictionaries through the help of the internet's citizens. This site allows individuals from all over the world to visit and assist in the translation of English words into other languages. The resulting lists of English words and their translated counterparts are then made available through this site to anyone, with no restrictions on their use."
***
In 1995, Tyler Chambers launched the Internet Dictionary Project (IDP) as a collaborative project to create free online dictionaries from English to other languages (French, German, Italian, Latin, Portuguese, Spanish).
Before launching the Internet Dictionary Project, Tyler created the Human-Languages Page (H-LP) in May 1994 as an index of linguistic internet resources. In 1998, there were 1,800 language-related resources in 100 languages, with six subject listings (languages and literature, schools and institutions, linguistics resources, products and services, organizations, jobs and internships) and two category listings (dictionaries, language lessons).
What exactly was the Internet Dictionary Project? As explained in 1998 on the project's website: "The Internet Dictionary Project's goal is to create royalty-free translating dictionaries through the help of the internet's citizens. This site allows individuals from all over the world to visit and assist in the translation of English words into other languages. The resulting lists of English words and their translated counterparts are then made available through this site to anyone, with no restrictions on their use. (…)
The Internet Dictionary Project began in 1995 in an effort to provide a noticeably lacking resource to the internet community and to computing in general — free translating dictionaries. Not only is it helpful to the online community to have access to dictionary searches at their fingertips via the World Wide Web, it also sponsors the growth of computer software which can benefit from such dictionaries — from translating programs to spelling-checkers to language-education guides and more. By facilitating the creation of these dictionaries online by thousands of anonymous volunteers all over the internet, and by providing the results free-of-charge to anyone, the Internet Dictionary Project hopes to leave its mark on the internet and to inspire others to create projects which will benefit more than a corporation's gross income."
Tyler wrote in September 1998 in an email interview: "Multilingualism on the web was inevitable even before the medium 'took off', so to speak. 1994 was the year I was really introduced to the web, which was a little while after its christening but long before it was mainstream. That was also the year I began my first multilingual web project, and there was already a significant number of language-related resources online. This was back before Netscape even existed — Mosaic was almost the only web browser, and webpages were little more than hyperlinked text documents. As browsers and users mature, I don't think there will be any currently spoken language that won't have a niche on the web, from Native American languages to Middle Eastern dialects, as well as a plethora of 'dead' languages that will have a chance to find a new audience with scholars and others alike online. (…)
While I'm not multilingual, nor even bilingual, myself, I see an importance to language and multilingualism that I see in very few other areas. (…) Overall, I think that the web has been great for language awareness and cultural issues — where else can you randomly browse for 20 minutes and run across three or more different languages with information you might potentially want to know? (…)
To say that the internet is spurring multilingualism is a bit of a misconception, in my opinion — it is communication that is spurring multilingualism and cross-cultural exchange, the internet is only the latest mode of communication which has made its way down to the (more-or-less) common person. (…) Language will become even more important than it already is when the entire planet can communicate with everyone else (via the web, chat, games, email, and whatever future applications haven't even been invented yet)."
In spring 2001, the Human-Languages Page merged with the Languages Catalog, a section of the WWW Virtual Library, to become iLoveLanguages. In September 2003, iLoveLanguages provided an index of 2,000 linguistic resources in 100 languages. As for the Internet Dictionary Project, Tyler ran out of time to maintain it and removed the ability to update the dictionaries in January 2007. People can still search the available dictionaries or download the archived files.
1995 > NETGLOS, A GLOSSARY OF THE INTERNET
[Summary] Launched in 1995 by the WorldWide Language Institute (WWLI), an institute providing language instruction via the internet, NetGlos — which stands for "Multilingual Glossary of Internet Terminology" — was compiled as a voluntary collaborative project by a number of translators and other language professionals worldwide. In September 1998, NetGlos was available in 13 languages (Chinese, Croatian, English, Dutch/Flemish, French, German, Greek, Hebrew, Italian, Maori, Norwegian, Portuguese, Spanish). As explained by Brian King, director of the WorldWide Language Institute, in September 1998: “Before a new term becomes accepted as the 'correct' one, there is a period of instability where a number of competing candidates are used. Often an English loan word becomes the starting point — and in many cases the endpoint. But eventually a winner emerges that becomes codified into published technical dictionaries as well as the everyday interactions of the non technical user.”
***
NetGlos — which stands for "Multilingual Glossary of Internet Terminology" — was compiled as a voluntary collaborative project by a number of translators and other language professionals worldwide.
NetGlos was launched in 1995 by the WorldWide Language Institute (WWLI), an institute providing language instruction via the internet. Three years later, NetGlos was available in 13 languages (Chinese, Croatian, English, Dutch/Flemish, French, German, Greek, Hebrew, Italian, Maori, Norwegian, Portuguese, Spanish).
As explained by Brian King, director of the WorldWide Language Institute, in September 1998: “Much of the technical terminology on the web is still not translated into other languages. And as we found with (…) NetGlos, the translation of these terms is not always a simple process. Before a new term becomes accepted as the 'correct' one, there is a period of instability where a number of competing candidates are used. Often an English loan word becomes the starting point — and in many cases the endpoint. But eventually a winner emerges that becomes codified into published technical dictionaries as well as the everyday interactions of the non technical user. The latest version of NetGlos is the Russian one and it should be available in a couple of weeks or so [in late September 1998]. It will no doubt be an excellent example of the ongoing, dynamic process of 'russification' of web terminology.”
How about the future? "As a company that derives its very existence from the importance attached to languages, I believe the future will be an exciting and challenging one. But it will be impossible to be complacent about our successes and accomplishments. Technology is already changing at a frenetic pace. Lifelong learning is a strategy that we all must use if we are to stay ahead and be competitive. This is a difficult enough task in an English-speaking environment. If we add in the complexities of interacting in a multilingual/multicultural cyberspace, then the task becomes even more demanding. As well as competition, there is also the necessity for cooperation — perhaps more so than ever before. The seeds of cooperation across the internet have certainly already been sown. Our NetGlos Project has depended on the goodwill of volunteer translators from Canada, U.S., Austria, Norway, Belgium, Israel, Portugal, Russia, Greece, Brazil, New Zealand and other countries. I think the hundreds of visitors we get coming to the NetGlos pages everyday is an excellent testimony to the success of these types of working relationships. I see the future depending even more on cooperative relationships — although not necessarily on a volunteer basis."
1995 > VARIOUS LANGUAGES ON OUR SCREEN
[Summary] In December 1995, Yoshi Mikami, a computer scientist at Asia Info Network in Fujisawa, Japan, created the website "The Languages of the World by Computers and the Internet", also known as the Logos Home Page or Kotoba Home Page, "to summarize there the brief history, linguistic and phonetic features, writing system and computer processing aspects for each of the six major languages of the world, in English and Japanese". As a second step, Yoshi Mikami was also the co-author (with Kenji Sekine and Nobutoshi Kohara) of "The Multilingual Web Guide" (Japanese edition), a print book published in August 1997 by O'Reilly Japan, and translated in 1998 into English, French and German.
***
In December 1995, Yoshi Mikami created the website "The Languages of the World by Computers and the Internet", also known as the Logos Home Page or Kotoba Home Page, "to summarize there the brief history, linguistic and phonetic features, writing system and computer processing aspects for each of the six major languages of the world, in English and Japanese".
Yoshi Mikami was a computer scientist at Asia Info Network in Fujisawa, Japan. As a second step, one year after launching his website, he was also the co-author (with Kenji Sekine and Nobutoshi Kohara) of "The Multilingual Web Guide" (Japanese edition), a print book published in August 1997 by O'Reilly Japan, and translated in 1998 into English, French and German.
Yoshi explained in December 1998: "My native tongue is Japanese. Because I had my graduate education in the U.S. and worked in the computer business, I became bilingual in Japanese and American English. I was always interested in languages and different cultures, so I learned some Russian, French and Chinese along the way. In late 1995, I created on the web ‘The Languages of the World by Computers and the Internet’ and tried to summarize there the brief history, linguistic and phonetic features, writing system and computer processing aspects for each of the six major languages of the world, in English and Japanese. As I gained more experience, I invited my two associates to help me write a book on viewing, understanding and creating multilingual webpages, which was published in August 1997 as 'The Multilingual Web Guide', in a Japanese edition, the world's first book on such a subject."
As for multilingualism, Yoshi added: "Thousands of years ago, in Egypt, China and elsewhere, people were more concerned about communicating their laws and thoughts not in just one language, but in several. In our modern world, most nation states have each adopted one language for their own use. I predict greater use of different languages and multilingual pages on the internet, not a simple gravitation to American English, and also more creative use of multilingual computer translation. 99% of the websites created in Japan are written in Japanese.”
1995 > GLOBAL REACH, PROMOTING LOCALIZATION
[Summary] Ten years after founding Euro-Marketing Associates, a company based in San Francisco and Paris, Bill Dunlap launched in 1995 Global Reach, a marketing consultancy helping U.S. companies to expand their internet presence into an international framework. This included translating a website into other languages, actively promoting it, and using local online banner advertising to increase local website traffic. Bill Dunlap explained in December 1998: “Promoting your website is at least as important as creating it, if not more important. You should be prepared to spend at least as much time and money in promoting your website as you did in creating it in the first place. With the Global Reach program, you can have it promoted in countries where English is not spoken, and achieve a wider audience… and more sales.”
***
In 1995, Bill Dunlap launched Global Reach, a marketing consultancy helping U.S. companies to expand their internet presence into an international framework. This included translating a website into other languages, actively promoting it, and using local online banner advertising to increase local website traffic.
Ten years earlier, Bill Dunlap founded Euro-Marketing Associates, a company based in San Francisco and Paris. He wrote in December 1998: “There are so few people in the U.S. interested in communicating in many languages — most Americans are still under the delusion that the rest of the world speaks English. However, in Europe, the countries are small enough so that an international perspective has been necessary for centuries. (…)
Since 1981, when my professional life started, I've been involved with bringing American companies in Europe. This is very much an issue of language, since the products and their marketing have to be in the languages of Europe in order for them to be visible here. Since the web became popular in 1995 or so, I have turned these activities to their online dimension, and have come to champion European e-commerce among my fellow American compatriots. Most lately at Internet World in New York, I spoke about European e-commerce and how to use a website to address the various markets in Europe. (…)
Promoting your website is at least as important as creating it, if not more important. You should be prepared to spend at least as much time and money in promoting your website as you did in creating it in the first place. With the Global Reach program, you can have it promoted in countries where English is not spoken, and achieve a wider audience… and more sales. There are many good reasons for taking the online international market seriously. Global Reach is a means for you to extend your website to many countries, speak to online visitors in their own language and reach online markets there."
Bill added in July 1999: "After a website's home page is available in several languages, the next step is the development of content in each language. A webmaster will notice which languages draw more visitors (and sales) than others, and these are the places to start in a multilingual web promotion campaign. At the same time, it is always good to increase the number of languages available on a website: just a home page translated into other languages would do for a start, before it becomes obvious that more should be done to develop a certain language branch on a website."
1996 > ONELOOK DICTIONARIES, A “FAST FINDER”
[Summary] Robert Ware launched OneLook Dictionaries in April 1996 as a "fast finder" in hundreds of online dictionaries covering various topics: business, computer/internet, medical, miscellaneous, religion, science, sports, technology, general and slang. He wrote in September 1998: "On the personal side, I was almost entirely in contact with people who spoke one language and did not have much incentive to expand language abilities. Being in contact with the entire world has a way of changing that. And changing it for the better! (…) I have been slow to start including non-English dictionaries (partly because I am monolingual). But you will now find a few included." OneLook Dictionaries could browse 2 million words from 425 dictionaries in 1998, 2.5 million words from 530 dictionaries in 2000, 5 million words from 910 dictionaries in 2003, and 19 million words from 1,060 dictionaries in 2010.
***
Robert Ware launched OneLook Dictionaries in April 1996 as a "fast finder" in hundreds of online dictionaries covering various topics: business, computer/internet, medical, miscellaneous, religion, science, sports, technology, general and slang.
He wrote in September 1998: "On the personal side, I was almost entirely in contact with people who spoke one language and did not have much incentive to expand language abilities. Being in contact with the entire world has a way of changing that. And changing it for the better! (…) I have been slow to start including non-English dictionaries (partly because I am monolingual). But you will now find a few included."
Robert Ware also wrote about a personal experience showing the internet could promote both a common language and multilingualism: "In 1994, I was working for a college and trying to install a software package on a particular type of computer. I located a person who was working on the same problem and we began exchanging email. Suddenly, it hit me… the software was written only 30 miles away but I was getting help from a person half way around the world. Distance and geography no longer mattered! OK, this is great! But what is it leading to? I am only able to communicate in English but, fortunately, the other person could use English as well as German which was his mother tongue. The internet has removed one barrier (distance) but with that comes the barrier of language.
It seems that the internet is moving people in two quite different directions at the same time. The internet (initially based on English) is connecting people all around the world. This is further promoting a common language for people to use for communication. But it is also creating contact between people of different languages and creates a greater interest in multilingualism. A common language is great but in no way replaces this need. So the internet promotes both a common language *and* multilingualism. The good news is that it helps provide solutions. The increased interest and need is creating incentives for people around the world to create improved language courses and other assistance, and the internet is providing fast and inexpensive opportunities to make them available."
OneLook Dictionaries could browse 2 million words from 425 dictionaries in 1998, 2.5 million words from 530 dictionaries in 2000, 5 million words from 910 dictionaries in 2003, and 19 million words from 1,060 dictionaries in 2010.
1997 > 82.3% OF THE WEB IN ENGLISH
[Summary] The internet was born in 1974 in the U.S. before spreading to the English-speaking community and then worldwide. This explains why it took a while for languages other than English to gain ground online. The first major study of language distribution on the web was run by Babel, a joint project from the Internet Society and Alis Technologies to contribute to the internationalization of the internet. The results were published in June 1997 in seven languages on a webpage named "Web Languages Hit Parade". The main languages available on the web were English with 82.3%, German with 4.0%, Japanese with 1.6%, French with 1.5%, Spanish with 1.1%, Swedish with 1.1%, and Italian with 1.0%. Three years later, in spring 2000, non-English-speaking internet users reached 50%, a percentage that kept increasing steadily afterwards.
***
The first major study about language distribution on the web was run in 1997 by Babel, a joint project from the Internet Society and Alis Technologies to contribute to the internationalization of the internet.
The internet was born in 1974 in the U.S., spreading first through the English-speaking community before reaching users worldwide. This explains why it took a while for languages other than English to gain ground online. People from all over the world began to have access to the internet, despite a connection that was far from cheap in a number of countries, and to post webpages in their own languages. The percentage of online content in English slowly decreased from nearly 100% in 1983 to 85% in 1997.
“Towards communicating on the internet in any language…” was the subtitle of Babel, launched in 1997 as a plurilingual website in seven languages (English, French, German, Italian, Portuguese, Spanish, Swedish), with information about the world's languages and a typographical and linguistic glossary. A section named "The Internet and Multilingualism" gave information on how to develop a multilingual website, and how to code the "world's writing".
Babel ran the first major study relating to distribution of languages on the web. The results were published in June 1997 in seven languages on a webpage named “Web Languages Hit Parade”. The main languages of the web were English with 82.3%, German with 4.0%, Japanese with 1.6%, French with 1.5%, Spanish with 1.1%, Swedish with 1.1%, and Italian with 1.0%.
According to Randy Hobler, a consultant in internet marketing for language translation software and services, interviewed in September 1998: "85% of the content of the web in 1998 is in English and going down. This trend is driven not only by more websites and users in non-English-speaking countries, but by increasing localization of company and organization sites, and increasing use of machine translation to/from various languages to translate websites.”
Randy also explained in the same email interview: “Because the internet has no national boundaries, the organization of users is bounded by other criteria driven by the medium itself. In terms of multilingualism, you have virtual communities, for example, of what I call 'Language Nations'… all those people on the internet wherever they may be, for whom a given language is their native language. Thus, the Spanish Language nation includes not only Spanish and Latin American users, but millions of Hispanic users in the U.S., as well as odd places like Spanish-speaking Morocco."
According to Global Reach, a marketing consultancy promoting localization, there were 56 million non-English-speaking users in July 1998, with 22.4% Spanish-speaking users, 12.3% Japanese-speaking users, 14% German-speaking users and 10% French-speaking users. 15% of Europe's half a billion population spoke English as a first language, 28% didn't speak English at all, and 32% were using the web in English.
In summer 1999, the number of internet users living outside the
U.S. reached 50%.
In summer 2000, the number of internet users having a mother tongue other than English also reached 50%, and went on steadily increasing then. According to statistics regularly published online by Global Reach, they were 52.5% in summer 2001, 57% in December 2001, 59.8% in April 2002, 64.4% in September 2003 (including 34.9% non-English-speaking Europeans and 29.4% Asians), and 64.2% in March 2004 (including 37.9% non-English-speaking Europeans and 33% Asians).
1997 > THE INTERNET, A TOOL FOR MINORITY LANGUAGES
[Summary] Despite the so-called hegemony of the English language, the internet was also a good tool for minority languages, as stated by Caoimhín Ó Donnaíle, who has taught computing at the Institute Sabhal Mòr Ostaig, on the Isle of Skye, in Scotland. Caoimhín has maintained the trilingual (Scottish Gaelic, Irish Gaelic, English) college website, the main site worldwide with information on Scottish Gaelic, including a trilingual list of European minority languages. The internet could be a tool to develop a "cultural identity" for any language, while using the English language for this, as stated by Guy Antoine, who founded Windows on Haiti in April 1998 to promote the Haitian culture and language.
***
Despite the so-called hegemony of the English language, the
internet was also a good tool for minority languages, as stated by
Caoimhín Ó Donnaíle, who has taught computing at the Institute
Sabhal Mòr Ostaig, on the Isle of Skye, in Scotland.
Caoimhín has maintained the trilingual (Scottish Gaelic, Irish Gaelic, English) college website, the main site worldwide with information on Scottish Gaelic, including a trilingual list of European minority languages.
Interviewed in August 1998, Caoimhín saw four main points for the growth of a multilingual web: “(a) The internet has contributed and will contribute to the wildfire spread of English as a world language. (b) The internet can greatly help minority languages, but this will not happen by itself. It will only happen if people want to maintain the language as an aim in itself. (c) The web is very useful for delivering language lessons, and there is a big demand for this. (d) The Unicode (ISO 10646) character set standard is very important and will greatly assist in making the Internet more multilingual.”
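Caoimhín's fourth point — that Unicode (ISO 10646) would greatly assist a multilingual internet — can be made concrete with a minimal Python sketch (the sample words are purely illustrative). It shows that Gaelic, Greek and Japanese text, none of which fits in 7-bit ASCII's 128 code points, all map into the single Unicode repertoire and can therefore be stored together in one UTF-8 document.

```python
# One character set for every script: each sample below has code points
# beyond ASCII's 128, yet all encode into the same UTF-8 byte stream.
samples = {
    "Scottish Gaelic": "Sabhal Mòr Ostaig",  # Latin script with grave accents
    "Greek": "γλώσσα",                       # Greek script ("language")
    "Japanese": "言語",                       # CJK ideographs ("language")
}

for language, text in samples.items():
    code_points = [ord(ch) for ch in text]
    fits_ascii = all(cp < 128 for cp in code_points)
    utf8_bytes = text.encode("utf-8")        # one encoding covers all scripts
    print(f"{language}: {text!r} ASCII-only={fits_ascii}, "
          f"UTF-8 bytes={len(utf8_bytes)}")
```

Before Unicode, each script needed its own incompatible 8-bit code page, so a single page could not freely mix them; with UTF-8 the mixing is automatic.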
How about the Gaelic language? Caoimhín wrote in May 2001: "Students do everything by computer, use Gaelic spell-checking, a Gaelic online terminology database. There are more hits on our website. There is more use of sound. Gaelic radio (both Scottish and Irish) is now available continuously worldwide via the internet. A major project has been the translation of the Opera web-browser into Gaelic — the first software of this size available in Gaelic."
What about endangered languages? "I would emphasize the point that as regards the future of endangered languages, the internet speeds everything up. If people don't care about preserving languages, the internet and accompanying globalization will greatly speed their demise. If people do care about preserving them, the internet will be a tremendous help."
Robert Beard, co-founder of the web portal yourDictionary.com, wrote in January 2000: "While English still dominates the web, the growth of monolingual non-English websites is gaining strength with the various solutions to the font problems. Languages that are endangered are primarily languages without writing systems at all (only 1/3 of the world's 6,000+ languages have writing systems). I still do not see the web contributing to the loss of language identity and still suspect it may, in the long run, contribute to strengthening it. More and more Native Americans, for example, are contacting linguists, asking them to write grammars of their language and help them put up dictionaries. For these people, the web is an affordable boon for cultural expression."
The internet could be a tool to develop a "cultural identity" for any language, while using the English language for this, as stated by Guy Antoine, who founded Windows on Haiti in April 1998 to promote the Haitian culture and language.
Guy wrote in November 1999: "In Windows on Haiti, the primary language of the site is English, but one will equally find a center of lively discussion conducted in 'Kreyòl'. In addition, one will find documents related to Haiti in French, in the old colonial Creole, and I am open to publishing others in Spanish and other languages. I do not offer any sort of translation, but multilingualism is alive and well at the site, and I predict that this will increasingly become the norm throughout the web. (…)
The internet can serve, first of all, as a repository of useful information on minority languages that might otherwise vanish without leaving a trace. Beyond that, I believe that it provides an incentive for people to learn languages associated with the cultures about which they are attempting to gather information. One soon realizes that the language of a people is an essential and inextricable part of its culture. (…) ‘Kreyòl’ (Creole for the non-initiated) is primarily a spoken language, not a widely written one. I see the web changing this situation more so than any traditional means of language dissemination."
Guy added in June 2001: "Kreyòl is the only national language of Haiti, and one of its two official languages, the other being French. It is hardly a minority language in the Caribbean context, since it is spoken by eight to ten million people. (…) I have taken the promotion of Kreyòl as a personal cause, since that language is the strongest of bonds uniting all Haitians. (…) I have created two discussion forums on my website Windows on Haiti, held exclusively in Kreyòl. One is for general discussions on just about everything but obviously more focused on Haiti's current socio-political problems. The other is reserved only to debates of writing standards for Kreyòl. Those debates have been quite spirited and have met with the participation of a number of linguistic experts. The uniqueness of these forums is their non-academic nature.”
1997 > A EUROPEAN TERMINOLOGY DATABASE
[Summary] Launched in 1997 by the Translation Service of the European Commission, Eurodicautom was a multilingual terminology database of economic, scientific, technical and legal terms and expressions, with language pairs for the eleven official languages of the European Union (Danish, Dutch, English, Finnish, French, German, Greek, Italian, Portuguese, Spanish, Swedish), and Latin. There were 120,000 daily visits on average in 2003. In late 2003, Eurodicautom announced its integration into a larger terminology database in partnership with other institutions of the European Union. The new database, called IATE (InterActive Terminology for Europe), would be available in more than 20 languages, because of the enlargement of the European Union planned in 2004. IATE was launched on the intranet of some European institutions in spring 2004 and on the internet for free in March 2007.
***
Eurodicautom was a multilingual terminology database of economic, scientific, technical and legal terms and expressions, with language pairs for the eleven official languages of the European Union, and Latin.
Eurodicautom was initially developed to assist in-house translators. A free online version was available on the web in 1997 for European Union officials and for language professionals throughout the world.
Eurodicautom covered "a broad spectrum of human knowledge", mainly relating to economy, science, technology and legislation in the European Union (EU), to answer the needs of the 15 member countries in 11 official languages (Danish, Dutch, English, Finnish, French, German, Greek, Italian, Portuguese, Spanish, Swedish), plus Latin.
The project of a larger terminology database was studied as early as 1999 to merge the existing databases for a better inter- institutional cooperation between the European organizations. The project partners were the European Commission, the European Parliament, the Council of the European Union, the Court of Justice, the European Court of Auditors, the European Economic and Social Committee, the Committee of the Regions, the European Investment Bank, the European Central Bank, and the Translation Centre for the Bodies of the European Union.
Eurodicautom had 120,000 visits a day in late 2003, when it closed to prepare for a larger terminology database that would include the databases of other official European institutions. The new database would be available in many more languages — more than 20 instead of 12 — because of the enlargement of the European Union planned in 2004 to include new countries from Central and Eastern Europe. The European Union grew from 15 member countries to 25 in May 2004, and to 27 in January 2007.
IATE (InterActive Terminology for Europe) was first launched in summer 2004 on the intranet of the participating European institutions, then in March 2007 as an eagerly awaited free service on the web, with 1.4 million entries in the 23 official languages of the European Union (Bulgarian, Czech, Danish, Dutch, English, Estonian, Finnish, French, German, Greek, Hungarian, Irish, Italian, Latvian, Lithuanian, Maltese, Polish, Portuguese, Romanian, Slovak, Slovene, Spanish, Swedish), plus Latin.
The website has been maintained by the Translation Centre of the European Union institutions in Luxembourg. According to the IATE brochure, also available in the 23 official languages, IATE offered 8.4 million words in 2010, including 540,000 abbreviations and 130,000 expressions.
1997 > BABEL FISH, A FREE TRANSLATION SOFTWARE
[Summary] In December 1997, the search engine AltaVista launched the first free machine translation software called Babel Fish or AltaVista Translation, which could translate webpages or short texts from English into French, German, Italian, Portuguese or Spanish, and vice versa. The software was developed by Systran (an acronym for "System Translation"), a company specializing in automated language solutions. Babel Fish was a “hit” among the 12 million internet users of the time, who included more and more non- English-speaking users, and greatly contributed to a plurilingual web. Other tools were developed then by Alis Technologies, Globalink, Lernout & Hauspie and Softissimo, with free and/or paid versions available on the web.
***
In December 1997, the search engine AltaVista launched Babel Fish as the first free machine translation software from English to five other languages.
At the time, the interface of Yahoo! was available in seven languages (English, French, German, Japanese, Korean, Norwegian, Swedish), to take into account a growing number of non-English-speaking users. When a search didn't give any result in Yahoo!, it was automatically shunted to AltaVista, and vice versa.
Babel Fish, also called AltaVista Translation, could translate webpages from English into French, German, Italian, Portuguese or Spanish, and vice versa, the original page and the translation being face-to-face on the screen. Translating any short text was also possible with a “copy and paste”. The result was far from perfect but helpful, as well as instantaneous and free unlike a high-quality professional translation. Non-English-speaking users were thrilled. Babel Fish greatly contributed to a plurilingual web.
Backed up by plurilingual dictionaries with 12.5 million entries, Babel Fish was developed by Systran (an acronym for "System Translation"), a company specializing in automated language solutions. As explained on Systran’s website: "Machine translation software translates one natural language into another natural language. MT takes into account the grammatical structure of each language and uses rules to transfer the grammatical structure of the source language (text to be translated) into the target language (translated text). MT cannot replace a human translator, nor is it intended to."
Machine translation was defined as such on the website of the European Association for Machine Translation (EAMT): "Machine translation (MT) is the application of computers to the task of translating texts from one natural language to another. One of the very earliest pursuits in computer science, MT has proved to be an elusive goal, but today a number of systems are available which produce output which, if not perfect, is of sufficient quality to be useful for certain specific applications, usually in the domain of technical documentation. In addition, translation software packages which are designed primarily to assist the human translator in the production of translations are enjoying increasing popularity within professional translation organizations."
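The rule-based "transfer" approach described by Systran — a bilingual lexicon plus rules that map the grammatical structure of the source language onto the target language — can be illustrated with a deliberately tiny Python sketch. The lexicon and the single reordering rule here are invented for the example; real systems like Systran's relied on dictionaries of millions of entries and far more elaborate grammars.

```python
# Toy rule-based transfer translation: word-for-word lookup plus one
# structural rule (English puts the adjective before the noun; French
# usually puts it after). A sketch only, not how any real MT engine works.

LEXICON = {  # hypothetical English -> French word pairs
    "the": "le", "web": "web", "multilingual": "multilingue",
    "dictionary": "dictionnaire", "free": "gratuit",
}
ADJECTIVES = {"multilingual", "free"}
NOUNS = {"web", "dictionary"}

def translate(sentence: str) -> str:
    words = sentence.lower().split()
    reordered = []
    i = 0
    while i < len(words):
        # Transfer rule: swap an adjective-noun pair into noun-adjective order.
        if i + 1 < len(words) and words[i] in ADJECTIVES and words[i + 1] in NOUNS:
            reordered += [words[i + 1], words[i]]
            i += 2
        else:
            reordered.append(words[i])
            i += 1
    # Lexical transfer: substitute each word, keeping unknown words as-is.
    return " ".join(LEXICON.get(w, w) for w in reordered)

print(translate("the multilingual dictionary"))  # -> "le dictionnaire multilingue"
```

Everything a production system adds — morphology, agreement, full parsing, disambiguation — is omitted, which is precisely why, as the EAMT noted, output of useful quality was long limited to narrow technical domains.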
Other translation software was developed then by Alis Technologies,
Globalink, Lernout & Hauspie and Softissimo, with paid and/or free
versions available on the web. As for Babel Fish, it moved to
Yahoo!’s website in May 2008.
1997 > THE TOOLS OF THE TRANSLATION COMPANY LOGOS
[Summary] In December 1997, Logos, a global translation company based in Modena, Italy, decided to put on the web for free the professional tools used by its translators, for the internet community to be able to use them as well. These tools were the Logos Dictionary, a multilingual dictionary with 7.5 billion words (in fall 1998); the Logos Wordtheque, a multilingual library with 328 billion words extracted from translated novels, technical manuals, and other texts; the Logos Linguistic Resources, a database of 553 glossaries; and the Logos Universal Conjugator, a database for verbs in 17 languages. In 2007, the Logos Library (formerly Wordtheque) included 710 billion words, Linguistic Resources (no change of name) included 1,215 glossaries, and the Universal Conjugator (formerly Conjugation of Verbs) included verbs in 36 languages.
***
In December 1997, Logos, a global translation company, decided to put on the web all the professional tools used by its translators, for the internet community to freely use them as well.
Logos was founded by Rodrigo Vergara in 1979, with headquarters in Modena, Italy. In 1997, Logos had 300 in-house translators and 2,500 free-lance translators worldwide, who processed around 200 texts per day.
The linguistic tools available online were the Logos Dictionary, a multilingual dictionary with 7.5 billion words (in fall 1998); the Logos Wordtheque, a multilingual library with 328 billion words extracted from translated novels, technical manuals, and other texts, that could be searched by language, word, author or title; the Logos Linguistic Resources, a database of 553 glossaries; and the Logos Universal Conjugator, a database for verbs in 17 languages.
When interviewed by Annie Kahn in an article of the French daily Le Monde dated 7 December 1997, Rodrigo Vergara, head of Logos, explained: "We wanted all our translators to have access to the same translation tools. So we made them available on the internet, and while we were at it we decided to make the site open to the public. This made us extremely popular, and also gave us a lot of exposure. This move has in fact attracted many customers, and also allowed us to widen our network of translators, thanks to contacts made in the wake of the initiative."
In the same article, called “Les mots pour le dire” (The words to tell it), Annie Kahn wrote: "The Logos site is much more than a mere dictionary or a collection of links to other online dictionaries. The cornerstone is the document search program, which processes a corpus of literary texts available free of charge on the web. If you search for the definition or the translation of a word ('didactique', for example), you get not only the answer sought, but also a quote from one of the literary works containing the word (in our case, an essay by Voltaire). All it takes is a click on the mouse to access the whole text or even to order the book, including in foreign translations, thanks to a partnership agreement with the famous online bookstore Amazon.com. However, if no text containing the required word is found, the program acts as a search engine, sending the user to other web sources containing this word. In the case of certain words, you can even hear the pronunciation. If there is no translation currently available, the system calls on the public to contribute. Everyone can make suggestions, after which Logos translators check the suggested translations they receive."
Ten years later, in 2007, the Logos Library (formerly Wordtheque) included 710 billion words, Linguistic Resources (no change of name) included 1,215 glossaries, and the Universal Conjugator (formerly Conjugation of Verbs) included verbs in 36 languages.
1997 > SPECIALIZED TERMINOLOGY DATABASES
[Summary] Some international organizations have run terminology databases in their own field of expertise for their translation services. In 1997, some databases were freely available on the web, to be used by language professionals throughout the world and by the internet community at large, for example ILOTERM, maintained by the International Labor Organization (ILO), TERMITE (ITU Telecommunication Terminology Database), maintained by the International Telecommunication Union (ITU), and WHOTERM (WHO Terminology Information System), maintained by the World Health Organization (WHO).
***
In 1997, some specialized terminology databases maintained by international organizations in their own field of expertise were freely available on the web, to be used by language professionals throughout the world and by the internet community at large.
Here are three examples: ILOTERM, maintained by the International Labor Organization (ILO); TERMITE (ITU Telecommunication Terminology Database), maintained by the International Telecommunication Union (ITU); and WHOTERM (WHO Terminology Information System), maintained by the World Health Organization (WHO).
ILOTERM is a quadrilingual (English, French, German, Spanish) terminology database maintained by the Terminology and Reference Unit of the Official Documentation Branch (OFFDOC) at the International Labor Office (ILO) in Geneva, Switzerland. As explained on its website, the primary purpose of ILOTERM is to provide solutions, reflecting current usage, to terminology issues in the social and labor fields. Terms are available in English with their French, Spanish and German equivalents. The database also includes the ILO structure and programs, official names for international institutions, national bodies and employers' and workers' organizations, and names of international meetings and symposiums.
TERMITE, which stands for “Telecommunication Terminology Database”, is a quadrilingual (English, French, Spanish, Russian) terminology database maintained by the Terminology, References and Computer Aids to Translation Section of the Conference Department at the International Telecommunication Union (ITU) in Geneva, Switzerland. This database has been built on the content of all ITU printed glossaries since 1980, and regularly updated with recent entries.
WHOTERM, which stands for “WHO Terminology Information System”, is a trilingual (English, French, Spanish) database maintained by the World Health Organization (WHO) in Geneva, Switzerland. It included: (a) the WHO General Dictionary Index in English, with the French and Spanish equivalents; (b) three glossaries in English: Health for All, Programme Development and Management, and Health Promotion; and (c) WHO TermWatch, an awareness service on technical terminology reflecting current WHO usage, though not necessarily terms officially approved by WHO, with links to health-related terminology.
1998 > THE NEED FOR A “LINGUISTIC DEMOCRACY”
[Summary] Brian King, director of the WorldWide Language Institute (WWLI), brought up the concept of "linguistic democracy" in September 1998: "Whereas 'mother-tongue education' was deemed a human right for every child in the world by a UNESCO report in the early '50s, 'mother-tongue surfing' may very well be the Information Age equivalent. If the internet is to truly become the Global Network that it is promoted as being, then all users, regardless of language background, should have access to it. To keep the internet as the preserve of those who, by historical accident, practical necessity, or political privilege, happen to know English, is unfair to those who don't."
***
Brian King, director of the WorldWide Language Institute (WWLI), brought up the concept of "linguistic democracy" in September 1998: "Whereas 'mother-tongue education' was deemed a human right for every child in the world by a UNESCO report in the early '50s, 'mother-tongue surfing' may very well be the Information Age equivalent.
If the internet is to truly become the Global Network that it is promoted as being, then all users, regardless of language background, should have access to it. To keep the internet as the preserve of those who, by historical accident, practical necessity, or political privilege, happen to know English, is unfair to those who don't."
For Brian King, one factor contributing to the development of a multilingual internet is the “competition for a chunk of the 'global market' by major industry players”, with “the export of information technology around the world. Popularization has now occurred on a global scale and English is no longer necessarily the lingua franca of the user. Perhaps there is no true lingua franca, but only the individual languages of the users. One thing is certain — it is no longer necessary to understand English to use a computer, nor is it necessary to have a degree in computer science. A pull from non-English-speaking computer users and a push from technology companies competing for global markets has made localization a fast growing area in software and hardware development.”
Another factor is the development of electronic commerce. “Although a multilingual web may be desirable on moral and ethical grounds, such high ideals are not enough to make it other than a reality on a small scale. As well as the appropriate technology being available so that the non-English speaker can go, there is the impact of 'electronic commerce' as a major force that may make multilingualism the most natural path for cyberspace. Sellers of products and services in the virtual global marketplace into which the internet is developing must be prepared to deal with a virtual world that is just as multilingual as the physical world. If they want to be successful, they had better make sure they are speaking the languages of their customers!"
Founder of Euro-Marketing Associates and its virtual branch Global Reach, Bill Dunlap championed the assets of e-commerce in Europe among his compatriots in the U.S., promoting the internationalization and localization of their websites. He wrote in December 1998: "There are so few people in the U.S. interested in communicating in many languages — most Americans are still under the delusion that the rest of the world speaks English. However, in Europe, the countries are small enough so that an international perspective has been necessary for centuries."
Peter Raggett, deputy-head (and then head) of the Central Library of OECD (Organization for Economic Cooperation and Development), wrote in August 1999: "I think it is incumbent on European organizations and businesses to try and offer websites in three or four languages if resources permit. In this age of globalization and electronic commerce, businesses are finding that they are doing business across many countries. Allowing French, German, Japanese speakers to easily read one's website as well as English speakers will give a business a competitive edge in the domain of electronic trading."
As the internet quickly spread worldwide, companies needed to offer bilingual, trilingual, even plurilingual websites to reach as large an audience as possible, while adapting their content to a given audience, either a country or a linguistic community. Thus the need to internationalize and localize websites, which became a major trend in the late 1990s and early 2000s, with English-language companies and organizations setting up plurilingual websites, in English and other languages, and non-English-language companies and organizations setting up websites in their own language(s) and English.
1999 > BILINGUAL DICTIONARIES IN WORDREFERENCE.COM
[Summary] Michael Kellogg created WordReference.com in 1999. He wrote much later on his website: "I started this site in 1999 in an effort to provide free online bilingual dictionaries and tools to the world. The site has grown gradually ever since to become one of the most-used online dictionaries, and the top online dictionary for its language pairs of English-Spanish, English-French, English-Italian, Spanish-French, and Spanish-Portuguese. It is consistently ranked in the top 500 most-visited websites in the world. I am proud of my history of innovation with dictionaries on the internet. Many of the features such as being able to click any word in a dictionary entry were first implemented by me.” WordReference also provided high-quality language forums, and lighter versions of some dictionaries for mobile devices.
***
Michael Kellogg created WordReference.com in 1999 to offer free online bilingual translation dictionaries.
Much later, Michael wrote on his website: "I started this site in 1999 in an effort to provide free online bilingual dictionaries and tools to the world. The site has grown gradually ever since to become one of the most-used online dictionaries, and the top online dictionary for its language pairs of English-Spanish, English-French, English-Italian, Spanish-French, and Spanish-Portuguese. It is consistently ranked in the top 500 most-visited websites in the world. I am proud of my history of innovation with dictionaries on the internet. Many of the features such as being able to click any word in a dictionary entry were first implemented by me.”
What was the idea behind his project? “The internet has done an incredible job of bringing the world together in the last few years. Of course, one of the greatest barriers has been language. Much of the content is in English and many, many users are reading English-language webpages as a second language. I know from my own experiences with Spanish-language websites that many readers probably understand much of what they are reading, but not every single word”, thus the need for a website offering free online bilingual translation dictionaries.
In 2010, WordReference also offered a monolingual dictionary in English as well as dictionaries from English to other languages (Arabic, Chinese, Czech, Greek, Japanese, Korean, Polish, Portuguese, Romanian, Turkish), and vice versa. For the Spanish language, there was a monolingual dictionary, a dictionary of synonyms, a Spanish-French dictionary and a Spanish-Portuguese dictionary. Conjugation tables were available for French, Italian and Spanish. Monolingual dictionaries were available for German and Russian.
WordReference Mini was a miniature version of the site to be embedded into other sites, for example sites teaching languages online. A mobile device version was available for dictionaries from English to French, Italian and Spanish, and vice versa, with other language pairs to come.
As stated by Michael Kellogg: “Today [in 2011], I have three main goals with this website. First, continue to create free online bilingual dictionaries for English to many other languages. I strive to offer translations for "all" English words, terms, idioms, sayings, etc. Second, provide the world's best language forums; and third, continue to innovate to produce the best website and tools for the world.”
1999 > THE INTERNET, A MANDATORY TOOL FOR TRANSLATORS
[Summary] The internet became a mandatory tool for translators as “a vital and endless source of information”, as stated by Marcel Grangier, the head of the French Section of Central Linguistic Services, meaning he was in charge of organizing translation matters into French for the linguistic services of the Swiss government. He explained in January 1999: “To work without the internet is simply impossible now. Apart from all the tools used (email, the electronic press, services for translators), the internet is for us a vital and endless source of information in what I'd call the 'non-structured sector' of the web. For example, when the answer to a translation problem can't be found on websites presenting information in an organized way, in most cases search engines allow us to find the missing link somewhere on the network.” His services also offered an online directory called “Dictionnaires Électroniques” (Electronic Dictionaries) with links to most quality dictionaries available for free on the web.
***
The internet became a mandatory tool for translators as “a vital and endless source of information”, as stated by Marcel Grangier, the head of the French Section of Central Linguistic Services.
Marcel Grangier was in charge of organizing translation matters into French for the linguistic services of the Swiss government. He explained in January 1999: “To work without the internet is simply impossible now. Apart from all the tools used (email, the electronic press, services for translators), the internet is for us a vital and endless source of information in what I'd call the 'non-structured sector' of the web. For example, when the answer to a translation problem can't be found on websites presenting information in an organized way, in most cases search engines allow us to find the missing link somewhere on the network.
Our website was first conceived as an intranet service for translators in Switzerland, who often deal with the same kind of material as the Federal government's translators. Some parts of it are useful to any translators, wherever they are. The section "Dictionnaires Électroniques" [Electronic Dictionaries] is only one section of the website. Other sections deal with administration, law, the French language, and general information. The site also hosts the pages of the Conference of Translation Services of European States (COTSOES).”
"Dictionnaires Électroniques" is a extensive directory of free dictionaries available online, with five main sections: abbreviations and acronyms, monolingual dictionaries, bilingual dictionaries, multilingual dictionaries, and geographical information. The index could also be searched by keywords. It was later transferred on the new website of COTSOES.
According to Marcel Grangier, “we can see multilingualism on the internet as a happy and irreversible inevitability. So we have to laugh at the doomsayers who only complain about the supremacy of English. Such supremacy is not wrong in itself, because it is mainly based on statistics (more PCs per inhabitant, more people speaking English, etc.). The answer is not to 'fight' English, much less whine about it, but to build more sites in other languages. As a translation service, we also recommend that websites be multilingual. (…) The increasing number of languages on the internet is inevitable and can only boost multicultural exchanges. For this to happen in the best possible circumstances, we still need to develop tools to improve compatibility. Fully coping with accents and other characters is only one example of what can be done."
Maria Victoria Marinetti was a translator from French to Spanish living near Geneva, Switzerland, with a doctorate in engineering from Mexico. She wrote in August 1999: “I have access to a large amount of global information, which is very interesting for me. I can also regularly send or receive files back and forth. The internet allows me to receive or send general and technical translations from French into Spanish, and vice versa, and to correct texts in Spanish. In the technical or chemical fields, I offer technical assistance, as well as information about exporting high-tech equipment to Mexico or to other Latin American countries.”
As for multilingualism, "it is very important to be able to communicate in various languages. I would even say this is mandatory, because the information given on the internet is meant for the whole world, so why wouldn't we get this information in our language or in the language we wish? Worldwide information, but no broad choice for languages, this would be quite a contradiction, wouldn't it?"
In 2000, the internet was multilingual, with half of its users having a mother tongue other than English, but the language barrier was far from gone. While any language was now available on the web, many users were monolingual, and even plurilingual users couldn’t read all languages. Bridges were needed between language communities to improve the flow of information in other languages, including by offering better translation software and by offering tools for all languages, and not only the dominant ones.
1999 > THE NEED FOR BILINGUAL INFORMATION ONLINE
[Summary] With the web spreading worldwide, bilingual information online became mandatory, as stated by Henk Slettenhaar, a professor in communication technologies at Webster University, Geneva, Switzerland, and a trilingual European. Henk spent his childhood in Holland, taught his courses in English, and lived in neighboring France. He wrote in August 1999: "There are two main categories of websites in my opinion. The first one is the global outreach for business and information. Here the language is definitely English first, with local versions where appropriate. The second one is local information of all kinds in the most remote places. If the information is meant for people of an ethnic and/or language group, it should be in that language first, with perhaps a summary in English. We have seen lately how important these local websites are — in Kosovo and Turkey, to mention just the most recent ones. People were able to get information about their relatives through these sites."
***
With the web spreading worldwide, bilingual information online became mandatory, as stated by Henk Slettenhaar, a professor in communication technologies at Webster University, Geneva, Switzerland, and a trilingual European.
Henk spent his childhood in Holland, taught his courses in English, and lived in neighboring France. He wrote in December 1998: "I see multilingualism as a very important issue. Local communities that are on the web should principally use the local language for their information. If they want to present it to the world community as well, it should be in English too. I see a real need for bilingual websites. I am delighted there are so many offerings in the original language now. I much prefer to read the original with difficulty than getting a bad translation."
Henk added in August 1999: "There are two main categories of websites in my opinion. The first one is the global outreach for business and information. Here the language is definitely English first, with local versions where appropriate. The second one is local information of all kinds in the most remote places. If the information is meant for people of an ethnic and/or language group, it should be in that language first, with perhaps a summary in English. We have seen lately how important these local websites are — in Kosovo and Turkey, to mention just the most recent ones. People were able to get information about their relatives through these sites."