COMPUTING: DEFINED AND DESCRIBED: Part I
Computing is the activity of using computer hardware and software. As a field it could be defined in a general way as follows: "any goal-oriented activity requiring, benefiting from, or creating computers. Thus, computing includes designing and building hardware and software systems for a wide range of purposes; processing, structuring, and managing various kinds of information; doing scientific studies using computers; making computer systems behave intelligently; creating and using communications and entertainment media; finding and gathering information relevant to any particular purpose, and so on. The list is virtually endless, and the possibilities are vast." The term "computing" has sometimes been narrowly defined. In a report on computing as a discipline it was defined as: "the systematic study of algorithmic processes that describe and transform information: their theory, analysis, design, efficiency, implementation, and application."
The fundamental question underlying all computing is "What can be efficiently automated?" The meaning of "computing", however, depends on the context in which the term is used; an information systems specialist, for example, will view computing somewhat differently from a software engineer. Regardless of the context, doing computing well can be complicated and difficult. Because society needs people to do computing well, we must think of computing not only as a profession but also as a discipline. I have been involved with computing since 1986. In 2011 I completed my first quarter century using my personal computer, my PC. Readers can scroll down this page for a brief outline of those 25 years and the equipment I am currently using.
COMPUTING: DEFINED AND DESCRIBED: Part II
The term "computing" is also synonymous with counting and calculating. In earlier times, it was used in reference to mechanical computing machines. A computer is a machine that reads, stores, manipulates and displays data. The most common examples are the various personal computers; other common examples include mobile phones, MP3 players, and video game consoles. A computer manipulates data according to a set of instructions called a computer program. The program has an executable form that the computer can use directly to carry out the instructions; the same program, in its human-readable source code form, enables a programmer to study and develop the algorithm. Because different types of computers carry out instructions differently, a single set of source instructions is converted into machine instructions appropriate to the type of central processing unit. The execution process carries out the instructions in a computer program. Instructions express the computations performed by the computer; they trigger sequences of simple actions on the executing machine, and those actions produce effects according to the semantics of the instructions. The average computer user, like myself, does not need to understand or remember these sorts of details. I outline them briefly here, partly for my own use, and partly for readers who come to this section of my website. For more of this general overview on computing go to: https://en.wikipedia.org/wiki/Computing
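The distinction just described, between a program's human-readable source form and its executable form, can be illustrated with a few lines of Python. This is a sketch of my own for this page, not something the Wikipedia overview prescribes; Python's built-in compile() and exec() functions stand in, loosely, for the translation of source instructions into machine instructions and their execution.

```python
# The two forms a program takes: human-readable source text,
# and the executable form the machine actually runs.

source = "result = 2 + 3"   # source code: readable and studiable by a programmer

# compile() translates the source into an executable code object,
# loosely analogous to converting source instructions into machine
# instructions for a particular processor.
code_object = compile(source, "<example>", "exec")

# Execution: the instructions are carried out, producing effects
# according to the semantics of those instructions.
namespace = {}
exec(code_object, namespace)

print(namespace["result"])   # prints 5
```

The same source text could be executed on any machine with a Python interpreter, which is the point of the paragraph above: one set of source instructions, many possible executing machines.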
The following books were reviewed recently in several places in cyberspace. I leave it to readers with the interest to check them out: To Save Everything, Click Here: The Folly of Technological Solutionism by Evgeny Morozov, PublicAffairs; Hacking the Future: Privacy, Identity and Anonymity on the Web by Cole Stryker, Overlook; From Gutenberg to Zuckerberg: What You Really Need to Know About the Internet by John Naughton, Quercus; Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die by Eric Siegel, Wiley; Big Data: A Revolution That Will Transform How We Live, Work, and Think by Viktor Mayer-Schönberger and Kenneth Cukier, Eamon Dolan/Houghton Mifflin Harcourt; Status Update: Celebrity, Publicity, and Branding in the Social Media Age by Alice E. Marwick, Yale University Press; and Privacy and Big Data: The Players, Regulators and Stakeholders by Terence Craig and Mary E. Ludloff, O’Reilly Media.
Early this year, as part of the $92 million “Data to Decisions” program run by the Defense Advanced Research Projects Agency (DARPA), the Office of Naval Research began evaluating computer programs designed to sift through masses of information stored, traded, and trafficked over the Internet that, when put together, might predict social unrest, terrorist attacks, and other events of interest to the military. Blog posts, e-mail, Twitter feeds, weather reports, agricultural trends, photos, economic data, news reports, demographics—each might be a piece of an emergent portrait if only there existed a suitable, algorithmic way to connect them.
DARPA, of course, is where the Internet was created, back in the late 1960s, back when it was called ARPA and the new technology that allowed packets of information to be sent from one computer to another was called the ARPANET. In 1967, when the ARPANET was first conceived, computers were big, expensive, and slow by today’s standards, and resided primarily in universities and research institutions; neither Moore’s law—that processing power doubles every two years—nor the microprocessor, which was just being developed, had yet delivered personal computers to home, school, and office desktops.
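The doubling rule paraphrased above lends itself to simple arithmetic. The sketch below is my own illustration, not a claim about any particular chip; it uses the microprocessor's 1971 debut merely as a convenient starting point.

```python
def moore_factor(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor implied by a doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Four years = two doublings = a factor of four.
print(moore_factor(4))               # prints 4.0

# From the microprocessor's debut (1971) to 2021, i.e. 50 years:
print(f"{moore_factor(50):,.0f}")    # prints 33,554,432 -- tens of millions
```

Fifty years of biennial doubling implies a growth factor in the tens of millions, which is why machines that once filled rooms now fit in a pocket.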
Two decades later, a young British computer scientist at the European Organization for Nuclear Research named Tim Berners-Lee was looking for a way to enable CERN scientists scattered all over the world to share and link documents. When he conceived of the World Wide Web in 1989, about 86 million households had personal computers, though only a small percentage were online. Built on the open architecture of the ARPANET, which allowed discrete networks to communicate with one another, the World Wide Web soon became a way for others outside of CERN, and outside of academia altogether, to share information, making the Web bigger and more intricate with an ever-increasing number of nodes and links. By 1994, when the World Wide Web had grown to ten million users, “traffic was equivalent to shipping the entire collected works of Shakespeare every second.”
1994 was a seminal year in the life of the Internet. In a sense, it’s the year the Internet came alive, animated by the widespread adoption of the first graphical browser, Mosaic. Before the advent of Mosaic, and later Internet Explorer, Safari, Firefox, and Chrome, to name a few, information shared on the Internet was delivered in lines of visually dull, undistinguished, essentially static text. Mosaic made all those lines of text more accessible, adding integrated graphics and clickable links, opening up the Internet to the average, non-geeky user, not simply as an observer but as an active, creative participant. “Mosaic’s charming appearance encourages users to load their own documents onto the Net, including color photos, sound bites, video clips, and hypertext ‘links’ to other documents,” Gary Wolf wrote in Wired that year.
Social media are computer-mediated tools that allow people to create, share or exchange information, ideas, & photos/videos in virtual communities and networks. Social media is defined as "a group of Internet-based applications that build on the ideological and technological foundations of Web 2.0, and that allow the creation and exchange of user-generated content." Furthermore, social media depend on mobile and web-based technologies to create highly interactive platforms through which individuals and communities share, co-create, discuss, and modify user-generated content. They introduce substantial and pervasive changes to communication between businesses, organizations, communities, and individuals. These changes are the focus of the emerging field of technoself studies. Social media differ from traditional or industrial media in many ways, including: quality, reach, frequency, usability, immediacy and permanence. Social media operate in a dialogic transmission system, with many sources to many receivers. This is in contrast to traditional media, which operate under a monologic transmission model, with one source to many receivers.
"Social media has been broadly defined to refer to 'the many relatively inexpensive and widely accessible electronic tools that enable anyone to publish and access information, collaborate on a common effort, or build relationships.'" For a significant number of users social media has become a constant accompaniment to everyday life—a permanently unfolding self-narrative. For more on this subject go to: https://en.wikipedia.org/wiki/Social_media
A COMPELLING AUTHORITY
According to Ian Douglas, the use of the term 'globalization' intensified in the early to mid-1960s, at the same time my pioneering-travelling life began. Douglas is the director of the power foundation, <www.powerfoundation.org>, and a visiting scholar at the Watson Institute for International Studies at Brown University; he is currently working on a project entitled "The Birth of Biokinetic Society", part of a broader project entitled "On the Genealogy of Globalism". From 1967 to 1974 I travelled from the Canadian Arctic to the south of Tasmania. At that time I knew nothing about globalization as a term but, by my late teens, I had come to the conclusion that if our world was to survive it would require a global system, a global federation.
Globalization, according to Douglas, was accompanied by the rise of a transnational technocracy and of global governance institutions, and by a shift from production and trade to finance and private capital in a new system of international finance at the centre of the world political economy, an economy connecting the planet with telecommunications and computers, among a range of other shifts and changes. Douglas also quotes the French philosopher, social theorist, and historian of ideas Michel Foucault (1926-1984) to describe the human being living at the end of the twentieth century and in the early twenty-first as one who tries to invent himself, as one who is transformed by the technologies he employs, as the person at the centre of his own life-world, at the centre of his own biography, where the self is continually monitored by examining the environment.
In this context "true myth presents its images and its imaginary actors with a compelling authority." It is "an overt aestheticising and ordering of the world." Douglas quotes the German philosopher Ernst Cassirer (1874-1945) to say that "language, poetry, art, and religion…are in their origin bound up with mythical elements." Myth is a means of acting on the present, and it is the myth in its entirety which is alone important. -Ron Price with thanks to Ian Douglas, The Myth of Globalization, Online Filename: mg.pdf, 1997.
I've been telling you for years,
we need new forms of the social,
common myths, common stories,
new myths, for myths are dialogue,
technologies of the self, historical
necessities, defining moments in
time to tell us something has ended
in these years, these months, these
days, when we crossed a bridge to
which we shall never ever return.(1)
I've been telling you
I've got a myth here:
intact, total, detailed,
an overt aestheticizer,
an orderer of my world,
bound up as it is with
language, science, art,
poetry-the whole thing-
putting me at the centre,
monitoring each day's
invention with images
& actors of a compelling
authority from another
world, a world unseen.
(1) Universal House of Justice, Ridvan BE 157
27/4/'01 to 22/3/'13.
HISTORY OF COMPUTING
The first electronic digital computers were developed in the mid-20th century (1940–1945). Originally, they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). In this era mechanical analog computers were used for military applications. Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from MP3 players to fighter aircraft and from toys to industrial robots are the most numerous. For a detailed history of computing in the following four areas go to this link: http://en.wikipedia.org/wiki/Computer
1 History of computing
1.1 Limited-function early computers
1.2 First general-purpose computers
1.3 Stored-program architecture
1.4 Semiconductors and microprocessors
For a review of the book Turing’s Cathedral: The Origins of the Digital Universe by George Dyson go to: http://www.nybooks.com/articles/archives/2012/jun/07/how-computers-exploded/
For more information in relation to the following aspects of computing go to this link: http://en.wikipedia.org/wiki/Computer
2.1 Stored program architecture
2.3 Machine code
2.4 Programming language
2.4.1 Low-level languages
2.4.2 Higher-level languages
2.5 Program design
3.1 Control unit
3.2 Arithmetic logic unit (ALU)
3.4 Input/output (I/O)
3.7 Networking and the Internet
3.8 Computer architecture paradigms
4.1 Required technology
5 FURTHER TOPICS
5.1 Artificial intelligence
5.2.1 History of computing hardware
5.2.2 Other Hardware Topics
5.5 Professions and organizations
THE INTERNET: 1955 TO 2015
From the 1970s to the early 1990s, government & major research institutions were the predominant members of the Internet community; it was only in the early 1990s that the net was opened up to commercial and personal uses. I utilized the Internet from the early 1990s while I was a lecturer in General Studies and Human Services at the Swan College of TAFE in Western Australia, now Polytechnic West-Thornlie Campus. But it was not until I retired in 1999 that I really made use of the Internet. As the 1990s slowly turned into the 2000s, the internet took off.
While government monies fueled the development of the Internet, governments were not fully cognizant of what they were paying for. The government money spent in developing the Internet poured into the research labs of academics and computer researchers who, in effect, "built strains of American libertarianism, and even 1960s idealism, into the universal language of the Internet." The very DNA of the Internet, its underlying protocols, is governed by two principles: (1) Openness: any computer or network can join the universe of networks constituting 'the Internet'; and (2) Minimalism: very little is required of a computer or network in order to join. There is, of course, much more to this story. For a useful history of the internet going back to the 1950s go to: https://en.wikipedia.org/wiki/History_of_the_Internet
THE INTERNET: THEN AND NOW
In 1969, when the internet was nothing more than an intriguing military communication network, few could have imagined the far-reaching impact of the then-budding technology. You don't need me to tell you how deeply integrated the internet is in our daily lives today. It is a communication and information highway. It is a way to distribute entertainment, and it consumes more of our leisure time than we may want to admit. It gives voice to the marginalized and forgotten. It is the incubator of innovation. So how much has changed since the dawn of the internet age? Just about everything. We've pulled together a quick video that runs through some of the most staggering statistics available to describe the meteoric growth of the internet. Take a look at this video: http://www.whoishostingthis.com/resources/internet-history-video/
THE INTERNET: AN EVOLVING CONTEXT
Several key political decisions conditioned the rapid uptake of Internet technologies in the 1990s and 2000s, and enabled the Internet to transcend the social context in which it was conceived. The internet emerged as the poster-child technology for the new globalised economy. The decisions to open the Internet to commercial traffic and to privatise key governing institutions of the Internet – decisions that were themselves influenced by dramatic shifts in the broader political and cultural climate – were key enablers, opening the way for a reinterpretation and broader adoption of Internet technologies (cf. Bollier; Hafner; Hoover; McCormick). By divorcing the Internet from the US government, these decisions also made it easier to interpret the Internet as a “universal” technology – as a global, in the sense of international, infrastructure – well before this claim had a clear technical basis.
Since the 1990s, the Internet has rapidly become an essential utility for commerce, a pervasive medium of everyday social interaction, & a charged site of political contestation. As the technology has spread, it has also provided tools to destabilise earlier understandings of legal rights to private property, personal privacy, organisational and governmental transparency, freedom of expression and freedom of association. These tools have been used in conflicting ways: by governments seeking more encompassing means to monitor the activities of perceived enemies, dissidents, or everyday citizens; private corporations seeking to create new forms of private property or secure older property rights claims; individuals or movements seeking new forms of political association, new forms of freedom of expression - or new forms of criminal activity (Benkler; Berners-Lee Weaving; Biegel; Horten; Lessig Code; Mueller). For more on the subject of the evolution of the internet in a social context in a paper found in an online journal of media and culture (M/C Journal, Vol. 18, No. 2, 2015) go to: http://journal.media-culture.org.au/index.php/mcjournal/article/viewArticle/957
The central argument of Spreadable Media: Creating Value and Meaning in a Networked Culture, a 2013 book by Henry Jenkins, Sam Ford, & Joshua Green, is: "if it doesn't spread, it's dead." The book “challenges some of the prevailing metaphors and frameworks used to describe contemporary media.” Spreadable Media targets three readerships: media scholars, communication professionals, and people actively creating and sharing media content who are interested in the resulting cultural and industry changes. In fact, the book could appeal to anyone interested in understanding the cultural and social changes that are occurring online & in the media environment. The final chapter, Thinking Transnationally, provides a particularly interesting analysis of the international elements of spreadable media. “Spreadability” refers to the potential—both technical and cultural—for audiences to share content for their own purposes, sometimes with the permission of rights holders, sometimes against their wishes. For more on this subject go to: http://wpengine.com/2014/02/07/book-review-spreadable-media/
With computer-mediated communication (CMC) being widely employed in all fields, a growing body of CMC research has accumulated in recent decades. The research regarding the nature of CMC has been very controversial. Is CMC task-oriented, social-emotion-oriented, or both? Based on two delineated research models in CMC, this literature review indicates that CMC is both task-and-social-emotion-oriented in nature. Specifically, this paper discusses, compares, and contrasts several major aspects of these two research models. Results indicate that both models share similarities in three areas: research methods, participants’ characteristics, and task characteristics. However, both models have differences in three other areas, including (1) theoretical foundations, (2) technology involved and experimental duration in research methodology, and (3) major findings. Suggestions for future CMC research are proposed in order to more clearly identify the nature of CMC environments. For more go to The Electronic Journal of Sociology, "What Does Research Say about the Nature of Computer-mediated Communication: Task-Oriented, Social-Emotion-Oriented, or Both?" by Yuliang Liu, 2002: http://www.sociology.org/archive.html
THE COMPUTER: WORD RECOGNITION TECHNOLOGY
According to Wikipedia, Ray Kurzweil is an American author, inventor, futurist, and director of engineering at Google. Aside from futurology, he is involved in such fields as optical character recognition (OCR), text-to-speech synthesis, speech recognition technology, and electronic keyboard instruments. He is a computer engineer specializing in word recognition technology, with a side interest in bold predictions about future machines. He is not a professional neuroscientist or psychologist or philosopher. The New York Review of Books, 21/3/'13, has a review of Kurzweil's new book purporting to reveal—no less—“the secret of human thought.” Kurzweil tries to tell us, in no uncertain terms, “how to create a mind”: that is to say, he has a grand theory of the human mind in which its secrets will be finally revealed. Go to this link for more: http://www.nybooks.com/articles/archives/2013/mar/21/homunculism/?utm_medium=email&utm_campaign=March+5+2013&utm_content=March+5+2013+CID_fa71c8cea1e7d401bc1235b6afca5706&utm_source=Email%20marketing%20software&utm_term=Homunculism
One of the most remarkable effects of the Internet is that it permits unlimited specialisation of contacts and information. If you’re looking for an out-of-print book on an esoteric subject, you can find out instantly where there are copies of it in second-hand bookstores from Iceland to Australia, compare prices and conditions, and order it in a few seconds. You can read what people all over the world have to say on any topic that interests you, join in discussion with others who share that interest, and communicate your own ideas to the scattered community of specialists even in an arbitrarily narrow field – a community that could hardly exist without this possibility of electronic identification and expression. The availability of specialised networks applies to any type of interest whatever, and they are proliferating rapidly.
Cass Sunstein, a leading American Constitutional lawyer, is duly appreciative of this expansion of possibilities, but he wants to raise the alarm about another of its consequences, like those dire warnings of possible side-effects that come with every patent cold remedy. In this short, journalistic broadside he argues that the power the Internet gives each one of us to control deliberately what information and opinions we are exposed to – to tailor a communicative world to our prior interests and convictions – is a threat to democracy. The threat is that the choices made by individuals will add up collectively to a fragmentation of society so pervasive that the public sphere will cease to exist, and we will be left with multiple like-minded subgroups whose members talk only to their fellow members and never to those with whom they disagree, and who are exposed only to the kinds of opinions they want to hear. For more on this topic go to:http://www.lrb.co.uk/v23/n13/thomas-nagel/information-cocoons
Graphic design is the methodology of visual communication, and of problem-solving through the use of type, space and image. The field is considered a subset of visual communication and communication design, but sometimes the term "graphic design" is used interchangeably with these due to the overlapping skills involved. Graphic designers use various methods to create & combine words, symbols, & images to create visual representations of ideas and messages. A graphic designer may use a combination of typography, visual arts and page layout techniques to produce a final result. Graphic design often refers to both the process (designing) by which the communication is created & the products (designs) which are generated. For more on this subject go to: http://en.wikipedia.org/wiki/Graphic_design For a useful online journal on the subject of graphic design go to: http://issuu.com/elupton/docs/graphic-design-thinking
QUANTUM INFORMATION AND THE COMPUTER
In the summer of 2012 physicists celebrated a triumph that many consider fundamental to our understanding of the physical world: the discovery, after a multibillion-dollar effort, of the Higgs boson. Given its importance, many of those in the physics community expected the event to earn this year’s Nobel Prize in Physics. Instead, the award went to achievements in a field far less well-known and vastly less expensive: quantum information. It may not catch as many headlines as the hunt for elusive particles, but the field of quantum information may soon answer questions even more fundamental — and upsetting — than the ones that drove the search for the Higgs.
It could well usher in a radical new era of technology, one that makes today’s fastest computers look like hand-cranked adding machines. The basis for both the work behind the Higgs search and quantum information theory is quantum physics, the most accurate and powerful theory in all of science. With it we created remarkable technologies like the transistor and the laser, which, in time, were transformed into devices — computers and iPhones — that reshaped human culture. For more on "quantum information and computers" go to: http://www.nytimes.com/2012/10/14/opinion/sunday/the-possibilities-of-quantum-information.html?ref=opinion
THE FIRST COMPUTER
In his review of George Dyson’s Turing’s Cathedral, Jim Holt falls into the trap that Dyson set for him. See “How the Computers Exploded,” The New York Review of Books, 7 June 2012. Dyson has amplified the importance of John von Neumann’s MANIAC project to the point where Holt got the impression that it was the first useful computer and that it started a revolution. It wasn’t. It didn’t. John von Neumann’s MANIAC was not the first computer. Nor was it, as Holt dubs it, the first “genuine” computer, or the first high-speed, stored-program, all-purpose computer.
John von Neumann did not invent the stored-program architecture that often bears his name. By the time the MANIAC came online, several stored-program machines were operating and actually for sale in England and the US. Any number of computer history texts will bear this out. Dyson does include these facts in his book. Yet they are mostly brushed over, in a mad love affair with all things von Neumann. So it’s not surprising that Holt’s review gets basic facts wrong. If Holt had only doubted this revisionist history enough to check another source, he could have found that the ENIAC was very busy cranking through a variety of different computational problems from 1945 to 1955 (including one for the H-bomb). By 1948 it had a stored program. In 1949 the Manchester Baby and Mark I, the EDSAC and the BINAC were running. Eckert and Mauchly had contracts in government and industry to deliver UNIVACs. The world was lousy with computers!
The idea that von Neumann was some kind of torch carrier who convinced the world that computers were important just does not wash with the facts. It does, apparently, sell books. The insiders were convinced in 1946, when the ENIAC was revealed and the description of Eckert and Mauchly’s EDVAC was disseminated under von Neumann’s name. The population was convinced in 1952, when UNIVAC predicted the election on national TV. The fact that von Neumann continues to get credit for Eckert and Mauchly’s work is maddening. Go to this link for more on this subject:http://www.nybooks.com/articles/archives/2012/sep/27/who-gets-credit-computer-exchange/
COMPUTERS: SILICON BASED AND MOLECULE BASED
As Chief Librarian and Chief Information Officer of the University of Guelph, Michael Ridley spends his days integrating digital potentialities and the power of imagination with the cultural and historical resources of the library. Seeing the digital as a liminal space between the age of the alphabet and an era of post-literacy, he is transforming the mission of libraries. Gone are the days, says Ridley, when libraries primarily focused on developing collections. Today, collections are the raw materials fueling the library as a dissonance engine, an engine enabling collaborative, cross-disciplinary imaginations. We can think of technologies as metaphors for how we think. In the industrial revolution, continues Ridley, the mind was a machine, and that let us conceptualize how we think and act. Digital computing let us think of the mind as a computer: more powerful, symbolic, and so forth. But we know that the brain is more than that; it's a biochemical thing.
Computers, says Ridley, are simply an abstraction of how we think and a means to understand how we think using those technologies. Computers as we know them now will morph into something quite different. We need to track the shift from silicon based computing to molecular computing as the kind of next shift. Perhaps we won't call molecular computing "computing" given that it will be something technically much different. So the metaphor of computing is going to become sort of the horseless carriage.
It sounds almost as though computing might lose its current association with the computer; computing might return to refer to those who compute. The computers that we have around us today may just disappear. That's going to be the other piece. The capabilities that we invest in our devices will become internalized, and this will massively shift things. Computing will disappear as a visible facet of our daily lives. Computing today is outside of us as a tool but it, as I say, may just disappear. It might disappear in the same manner as electricity has for many Western citizens. Computing could shift to being a truly invisible utility that you just access when you need it, sort of like what cloud computing is like today.
The Internet may become so big that pretty soon it's going to disappear as well. It'll disappear exactly as the electrical grid did. It's just so pervasive that we won't see it anymore; electricity is like the air we breathe. I think the danger is that we won't think of computing any more; the less we worry about it as it recedes into the crevices of our lives, the more we cede responsibility to some other group to govern it for us. The Internet would just become a tool we use or an environment that we live in, and maybe we would be less conscious of what is going on in it, what does and doesn't happen in it, and that's significant. How much do you think about whether the electrical grid is ethical? For more of these ideas go to the online electronic journal interview at ctheory.net with the chief librarian at the University of Guelph, Ontario: http://www.ctheory.net/articles.aspx?id=674
THIS HUMBLE HABITATION
When a person plies their trade, their profession or some personal activity in one place for any length of time, they tend to keep certain items of equipment, gadgets, tools and resources on their work table or bench, in their study or shed. If some observer with literary skills, like myself, were to comprehensively describe his work-area, the work-area of a writer and author, poet and publisher, editor and researcher, online journalist and blogger, scholar and reader---again, as I say, like myself---such an observer might include in his description the following:
the writer’s desk--its size, quality and orderliness--his files, notebooks, stationery, pens and other aids, his computer, printer, sources of illumination (lamps, lights, access to daylight), photographs, paintings, pictures, objets d’art, a brief outline of his library, the writer’s attitudes to and treatment of his books, the frequency of their use; other items of furniture, technology and resources; the time spent in his study, in his micro-milieux, on a daily basis; the view out of the window and at the doorway, the sounds of the street and of nature; the cleanliness, the frequency with which the study is dusted and vacuumed.
There is much to describe and, depending on the level of detail in the description, a writer could go on for pages, but the above provides a general overview. -Ron Price, Pioneering Over Five Epochs, 20/2/'07 to 22/3/'13.
I’ve had a variety of workplaces
over the years: bedroom, lounge,
dining-room, study and now, in
these early years of late adulthood,
I have the kind of order suited to
my needs: an 18 ft. sq. desk space
with its lamp, trays and dictionary,
printer, computer, keyboard, jug
and glass of water, pens, mouse
and that lemon tree outside the
window in my wife’s lovely garden.
This place of creative tranquillity,
this humble habitation, this place
that is my study where I repose
in peace in this my retirement
far, far from the tumult of society
and its madding crowd in these
darkest hours before the dawn
where my soul can enjoy the
rendezvous with its Source and
the ventilation of a quickening,
renewing, clarifying, amplifying
wind and its rigorous effects.
20/2/'07 to 22/3/'13.
DETAILS IN RELATION TO MY COMPUTER SYSTEM
I keep the following details in relation to my computer system, and I use this information from time to time when required at various internet sites at which I post. I began keeping these notes in 2011 after owning a computer system for 25 years, from 1986 to 2011. The following information includes my update of 22/3/’13. I thank Mr Glen Schreuder of George Town, Tasmania for most of the hardware information below.
A. Part 1:
1.1 My first computer was installed in 1986 when I was living in South Hedland, in a region of Western Australia known as the Pilbara. I had several systems until the early years of the 21st century. A Dell Pentium-4 computer (model: Optiplex GT280) with a Windows XP operating system was my computer from July 2009 (circa) until 19/3/’13. This system had JScript 5.8 for Windows XP. Microsoft Word 2010 was installed in February 2011, replacing Microsoft Word 2003.
1.2 On 19/3/’13 a Windows 7 ultimate x64 operating system was installed.
1.3 A Cooler Master Elite 430 Black case, with a side window, was installed by Mr Glen Schreuder of GT Computing, 6 Goulburn Street, George Town, Tasmania, 7253. This all-black box with a black interior coating has plenty of space for large graphics cards. The Elite 430 Black also offers plenty of cooling options and a well-ventilated front mesh design to keep even the hottest system cool.
1.4 The following hardware items were also installed by GT Computing: (i) an AMD FX-8120 8-core processor; (ii) an ASRock 970 Pro3 R2 motherboard and a Gigabyte GeForce 210 1GB graphics card; (iii) Kingston HyperX KHX1600C9D3K2-8GX (2x4GB) DDR3 memory and a Thermaltake Litepower 600W power supply; (iv) a Western Digital WD Black 1 TB WD1002FAEX hard drive; (v) a Samsung SH-224BB-BEBS SATA DVD-RW drive (OEM); and (vi) hardware assembly and software installation, with a backup made for the recovery of old files.
2. A Xerox Model N17 Network Laser Printer operated until January 2010, when I got a new Hewlett-Packard “HP Colour LaserJet” printer (model: CP1518ni). In January 2012 a Brother mono laser printer was installed (model #: HL2270DW). It supports automatic duplex printing and wireless connectivity.
3. My current monitor was installed in 2009. It is a ViewSonic VA1903wmb (model # VS11618), manufactured in China in August 2008. It has a screen resolution of 1440 x 800 pixels, and a screen size of 15.7 x 10 inches, or 39 x 25 cm.
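The figures above allow a rough pixel-density estimate. The following is a small illustrative sketch only, using exactly the resolution and dimensions quoted above (which are as listed, not independently verified):

```python
# Rough pixel-density (DPI) estimate for the monitor described above:
# 1440 x 800 pixels on a panel of roughly 15.7 x 10 inches.
width_px, height_px = 1440, 800
width_in, height_in = 15.7, 10.0

dpi_horizontal = width_px / width_in  # pixels per inch, horizontally
dpi_vertical = height_px / height_in  # pixels per inch, vertically

print(f"approx. {dpi_horizontal:.0f} x {dpi_vertical:.0f} pixels per inch")
```

Figures in the low-90s pixels per inch are typical of desktop monitors of that era.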
4. My current keyboard is a DELL(model SK-8115). It, too, was made in China and installed on 1 July 2010.
5. My web browser (not, strictly speaking, a search engine) has been Mozilla Firefox (Nightly) for Windows 7 as of 19/3/’13. From 2/2006 to 2/2011 I used Mozilla Firefox for Windows XP. I accessed my Google mail through Thunderbird. Before 2006 I used the Eudora system. As of 19/3/’13 I continued using Gmail. Mozilla Firefox is a free and open-source web browser available for Microsoft Windows among other platforms.
6. I have Adobe Reader version 10. On 9/10/’11 Adobe Systems Inc. released a port of Adobe Reader X (10.1).
7. I utilized three anti-virus and anti-malware systems from 2/2011: AVG 9.0 anti-virus, an anti-malware program, and Lavasoft Ad-Aware. I used only a Malwarebytes system beginning on 19/3/'13.
8. Windows 7 was installed on 19/3/’13. Windows 7 is an operating system produced by Microsoft for use on personal computers, including home and business desktops. It was released to manufacturing on 22/7/’09 and became generally available at retail worldwide 3 and 1/2 years ago, on 22/10/’09. Its predecessor was Windows Vista. Windows 7 was succeeded by Windows 8 (a system designed with touch-screens in mind), which was released for general availability just five months ago, on 26/10/’12. I was advised on 19/3/’13 not to update to Windows 8 by my computer service technician here in George Town, Tasmania.
B. Now on the NBN
The national broadband network (NBN) came into my personal computing and internet work six months ago now, in late September 2012. The National Broadband Network is a national wholesale-only, open-access data network under development in Australia. Connections of up to one gigabit per second are sold to retail service providers (RSPs). My service provider is Internode. They sell Internet access and other services to me as a consumer. Until my wife and I went on the NBN we had 5 gigabytes/month of data that we could download and upload--in total. Now we have 30 gigabytes/month. This is about twice as much as we have used each month in the first six months on the NBN.
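As a quick arithmetic check on the quotas mentioned above (a sketch only; the figures are those quoted in the text):

```python
# Compare the pre-NBN and NBN monthly data quotas mentioned above
# (combined upload + download, in gigabytes per month).
old_quota_gb = 5
new_quota_gb = 30

increase_factor = new_quota_gb / old_quota_gb
print(f"The NBN quota is {increase_factor:.0f} times the old quota")  # 6 times
```

So the move to the NBN meant a six-fold increase in quota, of which roughly half gets used in a typical month.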
The NBN network is estimated to cost A$35.9 billion to construct over a 10-year period, including an Australian Government investment of A$27.5 billion. NBN Co, a government-owned corporation, was established to design, build and operate the NBN. Construction began with a trial rollout in Tasmania in July 2010. Two years and two months later my town, George Town, joined the NBN. The mainland Australia rollout began with the first services connected in April 2011. For more on this subject go to: http://en.wikipedia.org/wiki/National_Broadband_Network
CYBERSPACE: A NEW WORD
Cyberspace is "the notional environment in which communication over computer networks occurs." The word became popular in the 1990s, when the uses of the internet, networking, and digital communication were all growing dramatically and the term "cyberspace" was able to represent the many new ideas and phenomena that were emerging. The parent term of cyberspace is "cybernetics", derived from the Ancient Greek word kybernētēs, meaning steersman, governor, pilot, or rudder. The word cybernetics was introduced by Norbert Wiener for his pioneering work in electronic communication and control science. As a social experience, individuals can interact, exchange ideas, share information, provide social support, conduct business, direct actions, create artistic media, play games, engage in political discussion, and so on, using this global network.
Those who use this global network are sometimes referred to as cybernauts. The term cyberspace has become a conventional means to describe anything associated with the Internet and the diverse Internet culture. The United States government recognizes the interconnected information technology, and the interdependent network of information technology infrastructures operating across this medium, as part of the US national critical infrastructure. Amongst individuals in cyberspace, there is believed to be a code of shared rules and ethics mutually beneficial for all to follow, referred to as cyberethics. Many view the right to privacy as most important to a functional code of cyberethics. Such moral responsibilities go hand in hand with working online on global networks, specifically when opinions are involved in online social experiences.
According to Chip Morningstar and F. Randall Farmer, cyberspace is defined more by the social interactions involved rather than its technical implementation. In their view, the computational medium in cyberspace is an augmentation of the communication channel between real people; the core characteristic of cyberspace is that it offers an environment that consists of many participants with the ability to affect and influence each other. They derive this concept from the observation that people seek richness, complexity, & depth within a virtual world.
"The Future Is Here" by Ligaya Mishan is a review in The New York Review of Books (2/4/'15) of William Gibson's latest book The Peripheral (Putnam, 500 pages). Gibson forecast the Internet back in 1982, when, at the time of his writing, it was still inchoate, text-based, and the insular preserve of programmers. He described cyberspace as "a consensual hallucination experienced daily by billions," with "lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding." It was a user experience akin to mysticism, drug use, and sexual release. It hasn’t exactly turned out that way, but Gibson’s coinage of the term “cyberspace” lives on, and you can read a review of Gibson's new book at: http://www.nybooks.com/articles/archives/2015/apr/02/william-gibson-future-is-here/ For more on cyberspace go to: http://en.wikipedia.org/wiki/Cyberspace
THE INTERNET: A DOWNSIDE
In Cosmos and History: The Journal of Natural and Social Philosophy, vol. 8, no. 1, 2012, Dr Glenn McLaren wrote the following essay: THE TRIUMPH OF VIRTUAL REALITY AND ITS IMPLICATIONS FOR PHILOSOPHY AND CIVILIZATION. McLaren discusses Nicholas Carr's book, The Shallows: What the Internet is Doing to Our Brains. Carr brings together an extensive and impressive body of research in psychology, neuroscience and philosophy to reveal the internet to be detrimental to our development of abilities for deep understanding and concept formation. His main argument draws on relatively recent research which reveals our brains to be highly plastic. He suggests that Marshall McLuhan was right, and those arguing that technology is neutral wrong, in that the medium of the internet, and not just its content, is changing its users' brains in ways which may undermine the conditions for civilization. These are, he argues, the conditions for deep self-reflection in which humans can engage in what was for early humans an unnatural activity: deep reading and comprehension. It is unnatural because it requires the relatively secure and quiet conditions provided by a civilized society to enable deep concentration without distraction, a condition associated mainly with print technology and available to humans for only a relatively short period of our history.
The internet, moreover, is a technology designed to continually distract us; ‘an ecosystem of interruption’, as Cory Doctorow terms it. The ability of the digital screen to be sectioned into multiple presentations of information makes it a medium in which the deep participation of continual decision-making is required. It makes it, as McLuhan famously argued, an extremely ‘cool’ medium. McLuhan’s distinction between ‘hot’ and ‘cool’ media is set out in McLuhan, M., Understanding Media: The Extensions of Man (MIT Press, Massachusetts, 1994). Although it may be tempting to ignore those who suggest the value of the literary mind has always been exaggerated, that would be a mistake. Their arguments are another important sign of the fundamental shift taking place in society’s attitude toward intellectual achievement. Their words also make it a lot easier for people to justify that shift, to convince themselves that surfing the Web is a suitable, even superior, substitute for deep reading and other forms of calm and attentive thought. Post-literates provide the intellectual cover that allows thoughtful people to slip comfortably into the permanent state of distractedness that defines the online life. For the rest of this interesting critique of cyberspace go to: http://www.cosmosandhistory.org/index.php/journal/article/viewFile/292/462
CYBERSPACE: 1984 TO 2014
William Gibson, who invented the word “cyberspace” for his futuristic 1984 novel Neuromancer, has said that the notion came to him when he watched kids playing video games at an arcade in Vancouver. They stared into their consoles, turning knobs and pounding buttons to manipulate a universe no one else could see. They seemed to want nothing more than to vanish through the looking glass: "It seemed to me that what they wanted was to be inside the games, within the notional space of the machine. The real world had disappeared for them—it had completely lost its importance. They were in that notional space, and the machine in front of them was the brave new world." “Cyberspace” was a nonsense word. He hoped it would pass muster for his science-fictional purpose: to evoke a domain that might be created by networked computers—“a consensual hallucination experienced daily by billions of legitimate operators, in every nation.” Thirty years ago there was no such thing. For more of this 18/12/'14 review in The New York Review of Books, "Today’s Dead End Kids" by James Gleick, go to: http://www.nybooks.com/articles/archives/2014/dec/18/anonymous-todays-dead-end-kids/?insrc=toc The book under review is Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous by Gabriella Coleman (Verso, 450 pages, 2014).
ANOTHER REVIEW OF COLEMAN'S BOOK
"In September, photos of naked celebrities were hacked and then posted on the image-sharing website 4chan, to the shock and surprise of much of the world’s media. Gabriella Coleman would not have been surprised. On 4chan, posting nudes of strangers and celebrities happens almost every day: and this exciting, dreadful, raucous website provides the opening scene for her long-awaited story of the hacktivist collective Anonymous. The movement evolved, from a loose collective of teenaged 4channers posting porn, into one of the most interesting and unusual groups of our time, terrifying businesses, governments and individuals with their hacking and programming skills." For more of this review of Hacker, Hoaxer, Whistleblower, Spy in the Australian edition of The Guardian, 20/11/'14, by Jamie Bartlett, go to: http://www.theguardian.com/books/2014/nov/19/hacker-hoaxer-whistleblower-spy-many-faces-anonymous-gabriella-coleman-review
GOOGLE GETS GOING
In September 1998, as I was getting ready to finish my last two terms as a full-time teacher in what is now a polytechnic in Western Australia, Google set up workspace in a garage in Menlo Park, California. Google filed for incorporation that same month, as the vernal equinox passed in Australia, where I had been living for nearly 30 years. A bank account for this newly-established company was opened by Larry Page and Sergey Brin with a cheque for $100,000. These two young computer-science grad students at Stanford University had first met in 1995, and in 1997 they gave the name Google to their program and partnership. In 1998 they hired Craig Silverstein, a fellow student, as their first employee.
In September 1998 I had six more months of teaching left as a full-time lecturer. By the time I had retired and was set up in my sea-change town in Tasmania in September of 1999, Google was off and running in a big way. Now, 15 years later in March 2014, I draw on Google for my home library, and I have posted millions of my words on the internet. -Ron Price, Pioneering over Five Epochs, 1 March 2014.
By 2004 you had 800 employees
a campus environment and an
index of eight billion web pages.
I had written three books, posted
millions of words at thousands of
sites and become famous in worlds
of nanoseconds, pixels, kilobytes,
gigabytes really: no famous person
me--I would never be a celebrity, a
person of renown; I would remain
unobtrusive, obscure, far, far, far
from the bright lights of great renown.
David Halberstam, American Pulitzer
Prize-winning journalist and author
said that a writer should be like a
playwright putting people on stage,
putting ideas on stage, making the
reader become the audience. We live(1)
live our lives through celebrities, said
he and some, like me, live through the
world of Google & other worlds, David.
Indeed it is a very complex process just
how we all live, each and all of us, David.
(1) Halberstam was an American journalist who won the Pulitzer Prize in 1964. He became the tireless author of 20 books on topics as varied as America’s military failings in Vietnam, the deaths of firefighters at the World Trade Center and the high-pressure world of professional basketball. He was killed in a car crash south of San Francisco in April 2007. He was 73. Mr. Halberstam came into his own as a journalist in the early 1960s, and into the periphery of my life, when I was coming into my own as a Baha’i pioneer for the Canadian Bahá’í community. Halberstam was covering the nascent American war in South Vietnam for The New York Times, and I was struggling with my nascent academic career, my nascent animal impulses and my nascent religio-philosophical convictions. -Ron Price with thanks to Clyde Haberman, “David Halberstam, 73, Reporter and Author, Dies,” 24 April, 2007, The New York Times.
25/9/'08 to 25/2/'14.
A WAY OF TRAVELLING: A MODUS OPERANDI/VIVENDI
2 JOB APPLICATIONS A WEEK FOR 50 YEARS
JOB HUNTING 1955-2005
I write here about my way of travelling before the WWW came into my life in the 1990s. It was a way of travelling based on the extensive use of my resume. The information and details in my resume, a resume I no longer use in the job-hunting world, should help anyone wanting to know something about my professional background, my writing and my life. This resume might be useful for the few who still want to assess my suitability for some advertised or unadvertised employment position. I must emphasize, though, that I never apply for jobs any more. I stopped applying for FT & PT jobs over the period 1999 to 2005. I also began to cut back my general volunteer activity by 2004. I left the world of volunteer activity, except for work in an international organization I had been associated with for half a century, the Baha’i Faith, and several other international organizations listed on my LinkedIn profile page.
In my writing in these years of my retirement, 2006 to 2014, I travel virtually entirely in my mind. I should add, though, that I do travel a great deal in cyberspace. Computer technology had opened a new world as I headed into my 60s and 70s. And so it is that, after travelling in the world of the great new technological bird of the sky, the jet, which began its extensive movements to and from the cities of the world in the 1950s; after my own years of buying tickets to travel by air(1967-2002)-some 35 years, I never get into the sky any more. The times I travelled by air: to Baffin Island, to several cities in Canada, to Europe, to North America, to Australia, to Hong Kong, to Israel over those 35 years are now memories, happy ones that dotted my life with their landmarks of change and transition.
The years 55 to 60, 1999 to 2005, the last years of my middle age, marked a turning point for me into a much more extensive involvement in writing and, thanks to computer technology, my writing is now read by millions. Writing is, for most of its votaries, a solitary and hopefully stimulating leisure-time, part-time or full-time pursuit. Travel takes place but it is, as I say above, for the most part in my mind, my imagination and memory. In my case, now, as these middle years of my late adulthood (65-75) advance incrementally, writing is full-time; I write and read about 50 hours a week.(1) About 7 hours a day is all the time I can devote to my several literary pursuits due to a range of medications, vitamins and minerals which I take for: (a) my bipolar 1 disorder, prescribed by my psychiatrist, (b) my prostate and moderate chronic kidney disease, prescribed by my urologist and my renal physician, respectively, and (c) my general health and hygiene.
Inevitably the style of one's writing is a reflection of the person, their experience and their philosophy. I could set out my experience in an attachment, and I did so for some 50 years, 1955 to 2005, in a logical fashion in the form of a resume when necessary.(2) If, as the famous psychologist Carl Jung writes, and as the Romans in that first great city of more than a million 2000 years ago believed, we are what we do, then some of what I am could be found in that attachment. This document might seem over-the-top as they say these days since it goes on for some 30 pages, but fifty years(1955 to 2005) in the professional and non-professional job-world produces a great pile of stuff/things. This document is the last resume I used when I was in the job hunting game back in the early years of this 3rd millennium. I have updated it, of course, to include many of the writing projects I have taken-on during these first years, the first decade, of my retirement from full-time, casual and volunteer employment.
The resume has always been the piece of writing, the statement, the document, the entry ticket which, over the years, has opened up the possibilities of another adventure, another pioneering move to another town, another state or country, another location, work in another organization, another portion of my life. I'm sure that will also be the case in the years of my late adulthood(60-80) and old age(80++) should, for some reason, movement from place to place be necessary or desired. But this seems unlikely as I head into the last stages of my life. The first step was the job application and the second step, if the first was successful, was to get on a plane and go to a part of the world where you had never been and at the end of the journey would be a job interview.
People who come across this statement might like to see it as "what happens when you can travel and not have to go to work any more." In the last 14 years which have been the first years of an early retirement(1999 to 2013), I have been able to write to a much greater extent than I had been able in my early and middle adulthood(1965 to 1999) when job, family and the demands and interests of various community projects kept my nose to the grindstone as they say colloquially. And now, with the unloading of much of the volunteer work I took on from 1999-2005, with my last child having left home in 2005 and a more settled home environment on the domestic front than I've ever had, the last years of late adulthood(age 60 to 80) and old age beckon. My resume reflects this shift in my activity-base and travel is what it's all about now. But, as I say, it is travel in my head, on TV, in paintings, photos, pictures but never in those jets and their streams of energy, their booming and buzzing through the sky with their silence and their noise.
This process of frequent moves and frequent jobs was and is not everyone's style or pattern of living. I lived in 37 houses and 22 towns in the first 60 years of my life: 1943-2003. That was a good deal of travelling, let me tell you. Many millions of people live and die in the same town, city or state and their life's adventure takes place within that physical region, the confines of a relatively small place and, perhaps, a very few jobs in their lifetime. Physical movement is not essential to psychological and spiritual growth, nor is a long list of jobs, although some degree of inner change, some inner shifting is just about inevitable, or so it seems to me, especially as we have moved toward and entered this new millennium. Most of the people on Earth never get on a plane.
For many millions of people during the years 1943-2003, my years of being jobbed, the world was my oyster, as it was the oyster of many a million in the West. It was an oyster not so much in the manner of a tourist-oyster, although there was plenty of that, but rather in terms of working lives which came to be seen increasingly in a global context, a global oyster. This was true for me during those years in which I was looking for amusement, education and experience, some stimulating vocation and avocation, some employment security and comfort. My adventurous years of pioneering and travelling, my applying-for-jobs days, a particular form of travel, took place during the forty-year period 1962-2002.
My resume, which I won't include here, has been altered many times, of course, during those forty years. It is now for the most part, as I indicated above, not used in these years of my retirement, except as an information, bio-data, vehicle for interested readers. This document is a useful backdrop for those examining my writing, especially my poetry, although some poets regard their CV, resume, bio-data, lifeline, life-story, personal background as irrelevant to their writing-work. I frequently use this resume at various website locations now on the Internet when I want to provide some introductory background on myself. I could list many new uses after forty years of only one use--to help me get a job, make more money, experience some enrichment to my life, etcetera. The use of the resume saves one from having to reinvent the wheel, so to speak.
I don't have to say it all again in resume after resume to the point of utter tedium as I did so frequently when applying for jobs, especially in the days before the email and the internet. A few clicks of one's personal electronic-computer system and some aspect of life's game goes on or comes to a quick end—and another jet appears like magic on one’s personal horizon.
During those job-hunting years, 1962-2002, I applied for some four thousand jobs, an average of two a week for each of those forty years! Well, it's not the best base for travel, but it is very common. This is a guesstimation, as accurate a guesstimation as I can calculate for this forty-year period. The great bulk of the thousands of letters involved in this vast, detailed and, from time to time, exhausting and frustrating process, I did not keep. I did keep a small handful, perhaps half a dozen, of those letters in a file in the Letters: Section VII, Sub-Section X of my autobiographical work, Pioneering Over Four Epochs. Given the thousands of hours over fifty years devoted to the job-hunting process; given the importance of this key to the pioneering venture that is my life; given the amount of paper produced and energy expended in the process; given the amount of writing done in the context of these various jobs,(3) some of the correspondence seemed to warrant a corner in the written story of my life, my autobiography.(4)
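The guesstimate above is easy to check with a little arithmetic. The sketch below assumes roughly 52 weeks a year with no allowance for holidays, an approximation of my own:

```python
# Sanity-check the claim of "some four thousand jobs,
# an average of two a week for each of those forty years" (1962-2002).
applications_per_week = 2
weeks_per_year = 52   # assumed; no allowance made for holidays
years = 40

total_applications = applications_per_week * weeks_per_year * years
print(total_applications)  # 4160 -- roughly the four thousand estimated
```

Two a week over forty years gives a figure slightly above four thousand, so the round number quoted is consistent with the weekly average.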
It seemed appropriate, at least it was my desire, to write this short statement fitting all those thousands of resumes into a larger context. I like to see it as 'a perspective on travel.' The things we do when we retire!(5) Reflections on one’s experience of the age of popular jet travel, the opportunity to travel in a sort of fantasy land that really took off in the 1950s when I was a child and adolescent.
1. This involves reading and research, posting and responding to others on the internet, developing my own website and writing in several genres.
2. My resume is only included with this statement when it seems appropriate or on request. My summer jobs from 1955 to 1966 did not require resumes.
3. Beginning with the summer job I had in the Canadian Peace Research Institute in 1964, I wrote an unnumbered quantity of: summaries, reports, essays, evaluations, inter alia, in my many jobs. None of that material has been kept in any of my files.
4. The Letters section of my autobiography now occupies some 25 arch-lever files and two-ring binders and covers the period 1961 to 2011. I guesstimate the collection contains about 5000 letters. This does not include these thousands of job applications and their replies. I have kept, as I say above, about half a dozen of these letters.
Note: In the last two decades, 1991 to 2011, thousands of emails have been sent to me and replies have been written but, like the job applications, most have been deleted from any potential archive. For the most part these deleted emails seemed to have no long-term value in an archive of letters. They were deleted as quickly as they came in. Of course there are other emails; nearly all of the correspondence I have sent and received since about 1991, which would once have been in the form of letters, is now in the form of emails. They are kept in my files. A brief perusal of my files will indicate a great deal of the form of travel I am emphasizing here.
That's all folks!
SOME FINAL NOTES ON COMPUTER TECHNOLOGY
The literature now available on this subject has begun to fill libraries. It is not possible in an introductory page like this, an introduction to a sub-section of my website, to even begin to discuss the major issues and the history, the ethics and the philosophical issues involved. As of the autumnal equinox, 21/3/'13, the above details will have to suffice.