Tue 13 Feb 2024 // 11:17 UTC
OBIT: Polymath, pioneering developer of software and hardware, prolific writer, and true old-school hacker John Walker has passed away.
His death was announced in a brief personal obituary on SCANALYST, a discussion forum hosted on Walker's own remarkably broad and fascinating website, Fourmilab. Its name is a playful take on Fermilab, the US physics laboratory, and fourmi, the French for "ant."
Wade Miller (@WadeMiller_USMC):
Here @MSNBC helpfully makes it clear their disdain for Christians in America.
She says that if you believe that your rights come from God, you aren’t a Christian, you are a Christian nationalist.
Somehow they seem to not mention that our own founding documents make this…
1:08 PM · Feb 23, 2024 //
According to the Founding Fathers, our rights came from God.
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.
Per The Rights of the Colonists:
These [rights] may be best understood by reading and carefully studying the institutes of the great Law Giver and Head of the Christian Church, which are to be found clearly written and promulgated in the New Testament.
Lastly, per John Quincy Adams:
[T]he Declaration of Independence first organized the social compact on the foundation of the Redeemer’s mission upon earth. …[and] laid the cornerstone of human government upon the first precepts of Christianity.
They float the term "Christian nationalist" to scare the public away from those who believe their Lord and Savior is Jesus Christ, and that, yes, our rights do come from God. The majority, if not the overwhelming majority, of Christians believe that, whether they identify as nationalists or not. //
If Christianity ever becomes the minority in America, you will never hear about the religion from left-wing networks again because they will have achieved their goal.
We've had all kinds of men as president, but only one of them started a world war, prevented a military coup against the government of the United States, and used either a sword or pistol to convince a deputy sheriff that he had urgent business back at the office; that was George Washington, the original American badass.
Hobbes OS/2 Archive: "As of April 15th, 2024, this site will no longer exist."
In a move that marks the end of an era, New Mexico State University (NMSU) recently announced the impending closure of its Hobbes OS/2 Archive on April 15, 2024. For over three decades, the archive has been a key resource for users of the IBM OS/2 operating system and its successors, which once competed fiercely with Microsoft Windows. //
Archivists such as Jason Scott of the Internet Archive have stepped up to say that the files hosted on Hobbes are safe and already mirrored elsewhere. "Nobody should worry about Hobbes, I've got Hobbes handled," wrote Scott on Mastodon in early January. OS2World.com also published a statement about making a mirror. But it's still notable whenever such an old and important piece of Internet history bites the dust.
Like many archives, Hobbes started as an FTP site. "The primary distribution of files on the Internet were via FTP servers," Scott tells Ars Technica. "And as FTP servers went down, they would also be mirrored as subdirectories in other FTP servers. Companies like CDROM.COM / Walnut Creek became ways to just get a CD-ROM of the items, but they would often make the data available at http://ftp.cdrom.com to download." //
This story was updated on January 30 to reflect that the OS/2 archive likely started in 1990, according to people who ran the Hobbes server. The university ran Hobbes on one of two NeXT machines, the other called Calvin. //
Watching the video "America's Top 10 Ugliest Aircraft" on YouTube:
At around 7:40 into the video, when discussing the Vought Pirate, there are a few seconds showing the Pirate flying in formation with another aircraft that I personally find gorgeous. //
The "ugly aircraft" from your video is a Vought F6U Pirate, the aircraft furthest from the camera: you saw it flying in formation with Chance Vought Cutlass F7U-1
Anthony McAuliffe (centre) and his officers in Bastogne, Belgium, December 1944. The commander of the U.S. Army’s 101st Airborne would go down in history for his defiant, one-syllable reply to a German surrender ultimatum. //
“McAuliffe realized that some kind of answer had to be offered and he sat down to think it over. After several minutes he admitted to his officers that he did not know what to say in response.
IBM's SAA and CUA brought harmony to software design… until everyone forgot //
In the early days of microcomputers, everyone just invented their own user interfaces, until an Apple-influenced IBM standard brought about harmony. Then, sadly, the world forgot.
In 1981, the IBM PC arrived and legitimized microcomputers as business tools, not just home playthings. The PC largely created the industry that the Reg reports upon today, and a vast and chaotic market for all kinds of software running on a vast range of compatible computers. Just three years later, Apple launched the Macintosh and made graphical user interfaces mainstream. IBM responded with an obscure and sometimes derided initiative called Systems Application Architecture, and while that went largely ignored, one part of it became hugely influential over how software looked and worked for decades to come.
One bit of IBM's vast standard described how software user interfaces should look and work – and largely by accident, that particular part caught on and took off. It didn't just guide the design of OS/2; it also influenced Windows, DOS and DOS apps, and pretty much all software that followed. //
The problem is that developers who grew up with these pre-standardization tools, combined with various keyboardless fondleslabs where such things don't exist, don't know what CUA means. If someone's not even aware there is a standard, then the tools they build won't follow it. As the trajectories of KDE and GNOME show, even projects that started out compliant can drift in other directions.
This doesn't just matter for grumpy old hacks. It also disenfranchises millions of disabled computer users, especially blind and visually impaired people. You can't use a pointing device if you can't see a mouse pointer, but Windows can be navigated 100 per cent keyboard-only if you know the keystrokes – and all blind users do. Thanks to the FOSS NVDA tool, there's now a first-class screen reader for Windows that's free of charge.
Most of the same keystrokes work in Xfce, MATE and Cinnamon, for instance. Where some are missing, such as the Super key not opening the Start menu, they're easily added. This also applies to environments such as LXDE, LXQt and so on. //
Menu bars, dialog box layouts, and standard keystrokes to operate software are not just some clunky old 1990s design to be casually thrown away. They were the result of millions of dollars and years of R&D into human-computer interfaces, a large-scale effort to get different types of computers and operating systems talking to one another and working smoothly together. It worked, and it brought harmony in place of the chaos of the 1970s and 1980s and the early days of personal computers. It was also a vast step forward in accessibility and inclusivity, opening computers up to millions more people.
Just letting it fade away due to ignorance and the odd traditions of one tiny subculture among computer users is one of the biggest mistakes in the history of computing.
On Thursday, the UK's Government Communications Headquarters (GCHQ) announced the release of previously unseen images and documents related to Colossus, one of the first digital computers. The release marks the 80th anniversary of the code-breaking machines that significantly aided the Allied forces during World War II. While some in the public knew of the computers earlier, the UK did not formally acknowledge the project's existence until the 2000s.
Colossus was not one computer but a series of computers developed by British scientists between 1943 and 1945. These 2-meter-tall electronic beasts played an instrumental role in breaking the Lorenz cipher, a code used for communications between high-ranking German officials in occupied Europe. The computers were said to have allowed the Allies to "read Hitler's mind," according to The Sydney Morning Herald. //
The technology behind Colossus was highly innovative for its time. Tommy Flowers, the engineer behind its construction, used over 2,500 vacuum tubes to create logic gates, a precursor to the semiconductor-based electronic circuits found in modern computers. While 1945's ENIAC was long considered the clear front-runner in digital computing, the revelation of Colossus' earlier existence repositioned it in computing history. (However, it's important to note that ENIAC was a general-purpose computer, and Colossus was not.)
Douglas Engelbart changed computer history forever on December 9, 1968.
A half century ago, computer history took a giant leap when Douglas Engelbart—then a mid-career 43-year-old engineer at Stanford Research Institute in the heart of Silicon Valley—gave what has come to be known as the "mother of all demos."
On December 9, 1968 at a computer conference in San Francisco, Engelbart showed off the first inklings of numerous technologies that we all now take for granted: video conferencing, a modern desktop-style user interface, word processing, hypertext, the mouse, collaborative editing, among many others.
Even before his famous demonstration, Engelbart outlined his vision of the future more than a half-century ago in his historic 1962 paper, "Augmenting Human Intellect: A Conceptual Framework."
To open the 90-minute-long presentation, Engelbart posited a question that almost seems trivial to us in the early 21st century: "If in your office, you as an intellectual worker were supplied with a computer display, backed up by a computer that was alive for you all day, and was instantly responsible—responsive—to every action you had, how much value would you derive from that?"
Of course at that time, computers were vast behemoths that were light-years away from the pocket-sized devices that have practically become an extension of ourselves.
Engelbart, who passed away in 2013, was inspired by a now-legendary essay published in 1945 by Vannevar Bush, a physicist who had been in charge of the United States Office of Scientific Research and Development during World War II.
That essay, "As We May Think," speculated on a "future device for individual use, which is a sort of mechanized private file and library." It was this essay that stuck with a young Engelbart—then a Navy technician stationed in the Philippines—for more than two decades.
By 1968, Engelbart had created what he called the "oN-Line System," or NLS, a proto-Intranet. The ARPANET, the predecessor to the Internet itself, would not be established until late the following year.
Five years later, in 1973, Xerox debuted the Alto, considered to be the first modern personal computer. That, in turn, served as the inspiration for both the Macintosh and Microsoft Windows, and the rest, clearly, is history.
Oct 29, 2019
Marlor_AU:
Rector said:
LordEOD said:
Did Al Gore Say ‘I Invented the Internet’?
"Despite the multitudinous derisive references to the supposed quote that continue to be proffered even today, former U.S. vice president Al Gore never claimed that he “invented” the Internet, nor did he say anything that could reasonably be interpreted that way."
Al Gore never said that he invented the internet. He said he created the internet. See direct quote above.
Gore was the driving force behind legislation that opened up various agency-specific networks and united them to create the precursor to today’s internet. Without this legislation, the siloed networks would have continued developing in parallel. From Kahn and Cerf (linked above):
“As far back as the 1970s Congressman Gore promoted the idea of high speed telecommunications as an engine for both economic growth and the improvement of our educational system. He was the first elected official to grasp the potential of computer communications to have a broader impact than just improving the conduct of science and scholarship. Though easily forgotten, now, at the time this was an unproven and controversial concept. Our work on the Internet started in 1973 and was based on even earlier work that took place in the mid-late 1960s. But the Internet, as we know it today, was not deployed until 1983. When the Internet was still in the early stages of its deployment, Congressman Gore provided intellectual leadership by helping create the vision of the potential benefits of high speed computing and communication. As an example, he sponsored hearings on how advanced technologies might be put to use in areas like coordinating the response of government agencies to natural disasters and other crises.
As a Senator in the 1980s Gore urged government agencies to consolidate what at the time were several dozen different and unconnected networks into an "Interagency Network." Working in a bi-partisan manner with officials in Ronald Reagan and George Bush's administrations, Gore secured the passage of the High Performance Computing and Communications Act in 1991. This "Gore Act" supported the National Research and Education Network (NREN) initiative that became one of the major vehicles for the spread of the Internet beyond the field of computer science.”
If the guys who created the protocols that drive the internet are happy to credit Gore as the administrative driving force behind it, then who can argue with that?
Nathan2055:
mknelson said:
tjukken said:
"Without ARPANET, there would have been no Internet"I doubt that is true.
The internet is more than just wires between computers (or Tubes if you're from Alaska). It's the protocols that make it all work.
Those protocols, and the hardware they run on, in many ways have their origins in the early work on ARPANET.
This is exactly right.
ARPANET was, indisputably, the first network to implement TCP/IP, which eventually became the backbone protocols of the modern Internet. Now, ARPANET did not originally launch with TCP/IP; it originally used the far simpler Network Control Program, which had severe limitations and was not standardized outside of ARPANET. The need for a single, standardized protocol for sharing network resources that could be utilized by any computer system led to the development of the Transmission Control Program in 1974. Incidentally, the RFC for this system, RFC 675, is notable as it contains the first use of the word Internet, intended as a shorthand for "internetworking."
Transmission Control Program would later be split into the Transmission Control Protocol and the Internet Protocol, separating the transport layer and network layer for increased modularity. TCP/IP was declared the new standard for computer networking by the DoD in March 1982, and shortly after University College London did the same; the entire network was rebooted and switched over on January 1, 1983. This is another candidate for the Internet's birthday, along with March 12, 1989 (Tim Berners-Lee submitting his original proposal for CERN to adopt an information management system based on "hypertext"), December 20, 1990 (the first hypertext web page using HTML and HTTP was published, www.cern.ch, describing the World Wide Web project), and January 1, 1990 (the first fully commercial Internet backbone, not controlled or limited in any way by government or educational institutions, was switched on by PSInet).
That being said, I'd certainly argue that the first long-distance computer network connection, even if it wasn't using TCP/IP yet, makes the most sense to celebrate as the Internet's birthday.
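A side note on the TCP/IP split described above: the separation into a network layer and a transport layer is still visible in an ordinary sockets program today. The sketch below is mine, not from the discussion; it's plain Python, and "example.com" is just a placeholder host. AF_INET selects the network layer (IP) and SOCK_STREAM selects the transport layer (TCP); routing, fragmentation, and retransmission all happen below the application.

import socket

HOST = "example.com"  # placeholder host, purely for illustration
PORT = 80

# AF_INET picks the network layer (IPv4); SOCK_STREAM picks the
# transport layer (TCP). The application never touches anything lower.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(5.0)
    sock.connect((HOST, PORT))
    sock.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    print(sock.recv(200).decode("latin-1", errors="replace"))

That two-argument choice of address family and socket type is, in a small way, the legacy of the 1974 Transmission Control Program being split into separate network and transport protocols.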
RoninX:
Dibbit said:
tjukken said:
"Without ARPANET, there would have been no Internet"I doubt that is true.
I kinda concur.
While Arpanet was very early and important, the way it is framed always brings to mind the image that everything grew out of this original network. The truth is more that individual networks grew bigger and bigger and linked up. For instance, the Swiss network was already interconnected before hooking up to the "main node," so to speak.
If Arpanet hadn't been there, there would've been another big network that would've been appointed "the origin"
There would undoubtedly have been some sort of global computer network without the ARPANET, but it might have taken a very different form.
It could have evolved from something like Compuserve into a more cable-TV model, with a sharp distinction between computers that could serve information (which would need to be approved by the network supplier) and those that just accessed that information. So, like Comcast, except you would need Comcast's approval for any information you wanted to post to the network, and undoubtedly have to pay an additional fee.
Or it could have evolved from the heavily-regulated world of amateur radio, where some hobbyists were experimenting with packet radio for teletype communication -- where every user needs to have a license, and things like profanity are strictly prohibited.
Or it could have become a government bureaucracy, like the Post Office or the DMV, where the service is paid for by a combination of taxes and user fees, and all use is both licensed and tracked to individual users.
Or it could have grown out of Fidonet into an even more distributed model, where all of the networking was peer-to-peer, evolving into a network that would have been like torrents on steroids.
Or it could have been built and owned by Microsoft and have only supported Windows PCs.
Or any one of a dozen other possibilities.
Computer networks were inevitable, but the fact that the Internet works the way it currently does -- for better or worse -- is directly a result of the architecture pioneered by the ARPANET.
The precursor to the Internet carried its first login request on October 29, 1969. //
On October 29, 1969, at 10:30pm Pacific Time, the first two letters were transmitted over ARPANET. And then it crashed. About an hour later, after some debugging, the first actual remote connection between two computers was established over what would someday evolve into the modern Internet.
Funded by the Advanced Research Projects Agency (the predecessor of DARPA), ARPANET was built to explore technologies related to building a military command-and-control network that could survive a nuclear attack. But as Charles Herzfeld, the ARPA director who would oversee most of the initial work to build ARPANET, put it:
The ARPANET was not started to create a Command and Control System that would survive a nuclear attack, as many now claim. To build such a system was, clearly, a major military need, but it was not ARPA's mission to do this; in fact, we would have been severely criticized had we tried. Rather, the ARPANET came out of our frustration that there were only a limited number of large, powerful research computers in the country, and that many research investigators, who should have access to them, were geographically separated from them. //
The first letters transmitted, sent from UCLA to the Stanford Research Institute by UCLA student programmer Charley Kline, were "l" and "o." On the second attempt, the full message text, login, went through from the Sigma 7 to the 940. So, the first three characters ever transmitted over the precursor to the Internet were L, O, and L. //
When it was shut down, Vinton Cerf, one of the fathers of the modern Internet, wrote a poem in ARPANET's honor:
It was the first, and being first, was best,
but now we lay it down to ever rest.
Now pause with me a moment, shed some tears.
For auld lang syne, for love, for years and years
of faithful service, duty done, I weep.
Lay down thy packet, now, O friend, and sleep.
On Thursday, Internet pioneer Vint Cerf announced that Dr. David L. Mills, the inventor of Network Time Protocol (NTP), died peacefully at age 85 on January 17, 2024. The announcement came in a post on the Internet Society mailing list after Mills' daughter, Leigh, informed Cerf of her father's death.
"He was such an iconic element of the early Internet," wrote Cerf.
Dr. Mills created the Network Time Protocol (NTP) in 1985 to address a crucial challenge in the online world: the synchronization of time across different computer systems and networks. In a digital environment where computers and servers are located all over the world, each with its own internal clock, there's a significant need for a standardized and accurate timekeeping system.
NTP provides the solution by allowing clocks of computers over a network to synchronize to a common time source. This synchronization is vital for everything from data integrity to network security. For example, NTP keeps network financial transaction timestamps accurate, and it ensures accurate and synchronized timestamps for logging and monitoring network activities.
In the 1970s, during his tenure at COMSAT and involvement with ARPANET (the precursor to the Internet), Mills first identified the need for synchronized time across computer networks. His solution aligned computers to within tens of milliseconds. NTP now operates on billions of devices worldwide, coordinating time across every continent, and has become a cornerstone of modern digital infrastructure.
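For a concrete sense of the exchange NTP is built on, here is a minimal SNTP-style query in Python using only the standard library. This is my own sketch of the basic request/response, not Mills' full NTP implementation, which adds sample filtering, server selection, and gradual clock discipline; pool.ntp.org appears only as an example public time server.

import socket
import struct
import time

NTP_EPOCH_OFFSET = 2208988800  # seconds between the NTP epoch (1900) and the Unix epoch (1970)

def sntp_time(server="pool.ntp.org", port=123, timeout=5.0):
    # 48-byte request: first byte 0x1B = leap indicator 0, version 3, mode 3 (client).
    packet = b"\x1b" + 47 * b"\x00"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(packet, (server, port))
        data, _ = sock.recvfrom(512)
    # The Transmit Timestamp's seconds field sits at bytes 40-43 of the reply.
    seconds = struct.unpack("!I", data[40:44])[0]
    return seconds - NTP_EPOCH_OFFSET

print(time.ctime(sntp_time()))

A real NTP client uses all four timestamps in the packet to estimate offset and round-trip delay, and slews the local clock gradually rather than trusting a single server's transmit time.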
Niklaus Wirth, evangelist of lean software and devisor of 9 programming languages and an OS, was 89 //
In his work, the languages and tools he created, in his eloquent plea for smaller, more efficient software – even in the projects from which he quit – his influence on the computer industry has been almost beyond measure. The modern software industry has signally failed to learn from him. Although he has left us, his work still has much more to teach.
The story of the Comanches and the Red River War, whose 150th anniversary we mark this year, shows the absurdity of the ‘noble savage’ narrative. //
The Comanche were just as much imperialists as the Europeans ever were. Though Europeans could certainly be violently cruel, their culture at least censured violence against civilians — indeed, when stories of federal troops massacring defenseless Indians traveled east, the American people were horrified. The same cannot be said of the Comanche, whose brutality was an indelible component of their cultural identity.
It’s true as much today as it was 150 years ago that the West can learn from indigenous peoples such as the Comanche, who were not only tremendous horsemen and students of the natural world, but incredibly resourceful in finding a use for practically every part of the buffalo, which, with the horse, served as the cornerstone of their society. But that doesn’t mean we should embrace a simplistic, starry-eyed conception of native peoples, or a benighted, self-hating understanding of our own civilization.
The C programming language was devised in the early 1970s as a system implementation language for the nascent Unix operating system. Derived from the typeless language BCPL, it evolved a type structure; created on a tiny machine as a tool to improve a meager programming environment, it has become one of the dominant languages of today. This paper studies its evolution.
A short while ago, we told the story of the Boeing 757, pound-for-pound the most overpowered twin-jet passenger airliner of the jet age of aviation. It was and still is the kind of jet that can legitimately impress fighter jets with its climb-to-altitude capabilities thanks to two colossal engines. If all that's true, think of the Airbus A340 as the complete opposite. Despite sporting four engines instead of two, the A340 is notorious worldwide for being an absolute pig. For better or worse, the A340 is like a Geo Metro in the sky. //
In time, of Airbus's two factions advocating for either a twin-jet or a quad-jet arrangement for its new airframe, the twin-jet camp conceded that four engines were more marketable internationally than two. The only question remaining was what on Earth would power the new jet. Therein lay the future A340's true weakness: its engines.
The engine in question was the Franco-American CFM International CFM56 high-bypass turbofan. With well north of 30,000 examples produced since 1974, the CFM56 is one of the most prolific engines of the jet age. Everything from the DC-8 to multiple Boeing 737 iterations, along with their associated military variants, has made use of the CFM56 over the last 50 years. //
The last of the 377 A340s delivered to airline customers was completed in 2012. With the completion of Airbus' A380 jumbo jet program in 2021, it's doubtful whether Airbus will ever field another wide-body, quad-jet airliner. With the industry shifting ever more towards more efficient twin-jets, the A340 will forever remain a curious footnote in aviation history. https://www.airbus.com/en/who-we-are/our-history/commercial-aircraft-history/previous-generation-aircraft/a340-family //
The A340 has yet to see a fatal accident in three decades of commercial service.
The Notre Dame cathedral in Paris has been undergoing extensive renovation in the wake of a devastating 2019 fire. Previously hidden portions of its structure have revealed the use of iron reinforcements in the earliest phases of the cathedral's construction, making it the earliest known building of its type to do so. //
“Compared to other cathedrals, such as Reims, the structure of Notre Dame in Paris is light and elegant,” Jennifer Feltman of the University of Alabama, who was not involved in the research, told New Scientist. “This study confirms that use of iron made this lighter structure at Paris possible and thus the use of this material was crucial to the design of the first Gothic architect of Notre Dame.”