Safeguard your online legacy with the 100-Year Plan, a brand-new offering designed exclusively for those seeking the ultimate in security and longevity for their digital presence. This plan is for:
- Families who wish to preserve their digital assets—the stories, photos, sounds, and videos that make up their rich family history—for generations to come.
- Founders who want to protect and document their company’s past, present, and future.
- Individuals seeking a stable, flexible, and customized online home that can adapt to whatever changes the future of technology will bring.
The 100-Year Plan isn’t just about today. It’s an investment in tomorrow. Whether you’re cementing your own digital legacy or gifting 100 years of a trusted platform to a loved one, this plan is a testament to the future’s boundless potential.
The cost is $38,000. We hope people renew. If you’re interested in learning more, fill out the form found here:
Dealing with your digital legacy after your death is a big issue, one that requires a lot of thought and leaves many problems to be solved, so let's break it down into smaller pieces and think about them individually. This post is primarily a collection of thoughts about tackling the problem from the domains side, not hosting. Hosting is a topic for future posts.
The internet isn’t that old and most of the pioneers are still around. But we can see the wave coming, so let’s try to solve this problem before it breaks.
With this online TCP port scanner you can scan an IP address for open ports.
Reliable and free network scanner to analyze LAN. The program shows all network devices, gives you access to shared folders, provides remote control of computers (via RDP and Radmin), and can even remotely switch computers off. It is easy to use and runs as a portable edition. It should be the first choice for every network admin.
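At its core, the TCP connect scan these tools perform just attempts a handshake on each port and reports whether it completed. A minimal sketch using only Python's standard library (the host and port list below are placeholders, not anything from the tools above):

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Attempt a TCP handshake; connect_ex returns 0 only if it completed."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Check a few well-known ports on a host of your choice:
for port in (22, 80, 443):
    print(port, "open" if is_port_open("127.0.0.1", port) else "closed")
```

Real scanners parallelize this and add tricks like SYN scanning, but the open/closed signal is the same.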
Despite having a population of just 1,400, until recently, Tokelau’s .tk domain had more users than any other country. Here’s why.
Douglas Engelbart changed computer history forever on December 9, 1968.
A half century ago, computer history took a giant leap when Douglas Engelbart—then a mid-career 43-year-old engineer at Stanford Research Institute in the heart of Silicon Valley—gave what has come to be known as the "mother of all demos."
On December 9, 1968 at a computer conference in San Francisco, Engelbart showed off the first inklings of numerous technologies that we all now take for granted: video conferencing, a modern desktop-style user interface, word processing, hypertext, the mouse, collaborative editing, among many others.
Even before his famous demonstration, Engelbart outlined his vision of the future more than a half-century ago in his historic 1962 paper, "Augmenting Human Intellect: A Conceptual Framework."
To open the 90-minute-long presentation, Engelbart posited a question that almost seems trivial to us in the early 21st century: "If in your office, you as an intellectual worker were supplied with a computer display, backed up by a computer that was alive for you all day, and was instantly responsible—responsive—to every action you had, how much value would you derive from that?"
Of course at that time, computers were vast behemoths that were light-years away from the pocket-sized devices that have practically become an extension of ourselves.
Engelbart, who passed away in 2013, was inspired by a now-legendary essay published in 1945 by Vannevar Bush, a physicist who had been in charge of the United States Office of Scientific Research and Development during World War II.
That essay, "As We May Think," speculated on a "future device for individual use, which is a sort of mechanized private file and library." It was this essay that stuck with a young Engelbart—then a Navy technician stationed in the Philippines—for more than two decades.
By 1968, Engelbart had created what he called the "oN-Line System," or NLS, a proto-Intranet. The ARPANET, the predecessor to the Internet itself, would not be established until late the following year.
Five years later, in 1973, Xerox debuted the Alto, considered to be the first modern personal computer. That, in turn, served as the inspiration for both the Macintosh and Microsoft Windows, and the rest, clearly, is history.
Oct 29, 2019
Marlor_AU:
Rector said:
LordEOD said:
Did Al Gore Say ‘I Invented the Internet’?
"Despite the multitudinous derisive references to the supposed quote that continue to be proffered even today, former U.S. vice president Al Gore never claimed that he “invented” the Internet, nor did he say anything that could reasonably be interpreted that way."
Al Gore never said that he invented the internet. He said he created the internet. See direct quote above.
Gore was the driving force behind legislation that opened up various agency-specific networks and united them to create the precursor to today’s internet. Without this legislation, the siloed networks would have continued developing in parallel. From Kahn and Cerf (linked above):
“As far back as the 1970s Congressman Gore promoted the idea of high speed telecommunications as an engine for both economic growth and the improvement of our educational system. He was the first elected official to grasp the potential of computer communications to have a broader impact than just improving the conduct of science and scholarship. Though easily forgotten, now, at the time this was an unproven and controversial concept. Our work on the Internet started in 1973 and was based on even earlier work that took place in the mid-late 1960s. But the Internet, as we know it today, was not deployed until 1983. When the Internet was still in the early stages of its deployment, Congressman Gore provided intellectual leadership by helping create the vision of the potential benefits of high speed computing and communication. As an example, he sponsored hearings on how advanced technologies might be put to use in areas like coordinating the response of government agencies to natural disasters and other crises.
As a Senator in the 1980s Gore urged government agencies to consolidate what at the time were several dozen different and unconnected networks into an "Interagency Network." Working in a bi-partisan manner with officials in Ronald Reagan and George Bush's administrations, Gore secured the passage of the High Performance Computing and Communications Act in 1991. This "Gore Act" supported the National Research and Education Network (NREN) initiative that became one of the major vehicles for the spread of the Internet beyond the field of computer science.”
If the guys who created the protocols that drive the internet are happy to credit Gore as the administrative driving force behind it, then who can argue with that?
Nathan2055:
mknelson said:
tjukken said:
"Without ARPANET, there would have been no Internet"
I doubt that is true.
The internet is more than just wires between computers (or Tubes if you're from Alaska). It's the protocols that make it all work.
The origins of those protocols and the hardware they work on in many ways have their origins in the early work on ARPANET
This is exactly right.
ARPANET was, indisputably, the first network to implement TCP/IP, which eventually became the backbone protocols of the modern Internet. Now, ARPANET did not originally launch with TCP/IP; it originally used the far more simplistic Network Control Program, which had severe limitations and was not standardized outside of ARPANET. The need for a single, standardized protocol for sharing network resources that could be utilized by any computer system led to the development of the Transmission Control Program in 1974. Incidentally, the RFC for this system, RFC 675, is notable as it contains the first use of the word Internet, intended as a shorthand for "internetworking."
Transmission Control Program would later be split into the Transmission Control Protocol and the Internet Protocol, separating the transport layer and network layer for increased modularity. TCP/IP was declared the new standard for computer networking by the DoD in March 1982, and shortly afterward University College London did the same; the entire network was rebooted and switched over on January 1, 1983. This is another candidate for the Internet's birthday, along with March 12, 1989 (Tim Berners-Lee submitting his original proposal for CERN to adopt an information management system based on "hypertext"), December 20, 1990 (the first hypertext web page using HTML and HTTP was published, www.cern.ch, describing the World Wide Web project), and January 1, 1990 (the first fully commercial Internet backbone, not controlled or limited in any way by government or educational institutions, was switched on by PSInet).
That being said, I'd certainly argue that the first long-distance computer network connection, even if it wasn't using TCP/IP yet, makes the most sense to celebrate as the Internet's birthday.
RoninX:
Dibbit said:
tjukken said:
"Without ARPANET, there would have been no Internet"
I doubt that is true.
I kinda concur.
While Arpanet was very early and important, the way it is framed always brings to mind the image that everything grew out of this original network. The truth is more that individual networks grew bigger and bigger and linked up. For instance, the Swiss network was already interconnected before hooking up to the "main node," so to speak.
If Arpanet hadn't been there, there would've been another big network that would've been appointed "the origin"
There would undoubtedly have been some sort of global computer network without the ARPANET, but it might have taken a very different form.
It could have evolved from something like Compuserve into a more cable-TV model, with a sharp distinction between computers that could serve information (which would need to be approved by the network supplier) and those that just accessed that information. So, like Comcast, except you would need Comcast's approval for any information you wanted to post to the network, and undoubtedly have to pay an additional fee.
Or it could have evolved from the heavily-regulated world of amateur radio, where some hobbyists were experimenting with packet radio for teletype communication -- where every user needs to have a license, and things like profanity are strictly prohibited.
Or it could have become a government bureaucracy, like the Post Office or the DMV, where the service is paid for by a combination of taxes and user fees, and all use is both licensed and tracked to individual users.
Or it could have grown out of Fidonet into an even more distributed model, where all of the networking was peer-to-peer, evolving into a network that would have been like torrents on steroids.
Or it could have been built and owned by Microsoft and have only supported Windows PCs.
Or any one of a dozen other possibilities.
Computer networks were inevitable, but the fact that the Internet works the way it currently does -- for better or worse -- is directly a result of the architecture pioneered by the ARPANET.
The precursor to the Internet carried its first login request on October 29, 1969.
On October 29, 1969, at 10:30pm Pacific Time, the first two letters were transmitted over ARPANET. And then it crashed. About an hour later, after some debugging, the first actual remote connection between two computers was established over what would someday evolve into the modern Internet.
Funded by the Advanced Research Projects Agency (the predecessor of DARPA), ARPANET was built to explore technologies related to building a military command-and-control network that could survive a nuclear attack. But as Charles Herzfeld, the ARPA director who would oversee most of the initial work to build ARPANET put it:
The ARPANET was not started to create a Command and Control System that would survive a nuclear attack, as many now claim. To build such a system was, clearly, a major military need, but it was not ARPA's mission to do this; in fact, we would have been severely criticized had we tried. Rather, the ARPANET came out of our frustration that there were only a limited number of large, powerful research computers in the country, and that many research investigators, who should have access to them, were geographically separated from them.
The first letters transmitted, sent from UCLA to Stanford by UCLA student programmer Charley Kline, were "l" and "o." On the second attempt, the full message text, login, went through from the Sigma 7 to the 940. So, the first three characters ever transmitted over the precursor to the Internet were L, O, and L.
When it was shut down, Vinton Cerf, one of the fathers of the modern Internet, wrote a poem in ARPANET's honor:
It was the first, and being first, was best,
but now we lay it down to ever rest.
Now pause with me a moment, shed some tears.
For auld lang syne, for love, for years and years
of faithful service, duty done, I weep.
Lay down thy packet, now, O friend, and sleep.
On Thursday, Internet pioneer Vint Cerf announced that Dr. David L. Mills, the inventor of Network Time Protocol (NTP), died peacefully at age 85 on January 17, 2024. The announcement came in a post on the Internet Society mailing list after Cerf was informed of Mills' death by his daughter, Leigh.
"He was such an iconic element of the early Internet," wrote Cerf.
Dr. Mills created the Network Time Protocol (NTP) in 1985 to address a crucial challenge in the online world: the synchronization of time across different computer systems and networks. In a digital environment where computers and servers are located all over the world, each with its own internal clock, there's a significant need for a standardized and accurate timekeeping system.
NTP provides the solution by allowing clocks of computers over a network to synchronize to a common time source. This synchronization is vital for everything from data integrity to network security. For example, NTP keeps network financial transaction timestamps accurate, and it ensures accurate and synchronized timestamps for logging and monitoring network activities.
In the 1970s, during his tenure at COMSAT and involvement with ARPANET (the precursor to the Internet), Mills first identified the need for synchronized time across computer networks. His solution aligned computers to within tens of milliseconds. NTP now operates on billions of devices worldwide, coordinating time across every continent, and has become a cornerstone of modern digital infrastructure.
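The wire format behind that synchronization is compact: an SNTP client sends a mostly empty 48-byte packet, and timestamps come back as 64-bit fixed-point values (32-bit seconds since 1900 plus a 32-bit fraction). A minimal sketch of that framing per RFC 4330, with the network round trip itself omitted:

```python
import struct

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01)
NTP_UNIX_DELTA = 2_208_988_800

def make_sntp_request() -> bytes:
    """Build a 48-byte client request: LI=0, version=4, mode=3 (client)."""
    return struct.pack("!B47x", 0x23)

def parse_transmit_time(packet: bytes) -> float:
    """Extract the server's Transmit Timestamp (bytes 40-47) as Unix time."""
    secs, frac = struct.unpack("!II", packet[40:48])
    return secs - NTP_UNIX_DELTA + frac / 2**32
```

A full NTP implementation also uses the originate and receive timestamps to estimate round-trip delay and clock offset; this sketch only shows the packet layout.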
Daryl's Subnet Calculator
This document is designed to give the reader a reasonable working knowledge of TCP/IP subnetting, addressing, and routing. It is not intended to be complete, or to cover all issues. This is targeted toward LAN administrators just moving to TCP/IP, however it should help anyone who wants to know a little (more) about how TCP/IP works. This document does not, generally, apply to dial-up SLIP/PPP connections.
The difference between this (a primer) and an FAQ is that most FAQs, in practice, tend to be question-and-answer oriented, and generally seem to try to cover ALL issues, not just the ones frequently asked about. This primer is intended as a starting point for someone who has an interest in the subject, but doesn't know where to start or what questions to ask. It should also help to broaden the understanding of people who have worked with TCP/IP for a while, but either haven't had the time to study all the less-than-useful theory behind the subject, or have been somewhat overwhelmed by the many theoretical details and have missed the big picture.
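The arithmetic a subnet calculator performs can also be reproduced with Python's standard ipaddress module; the /26 network below is an arbitrary example:

```python
import ipaddress

def subnet_summary(cidr: str) -> dict:
    """Derive the mask, network/broadcast addresses, and usable host count."""
    net = ipaddress.ip_network(cidr)
    return {
        "netmask": str(net.netmask),
        "network": str(net.network_address),
        "broadcast": str(net.broadcast_address),
        # Subtract the network and broadcast addresses themselves
        "usable_hosts": net.num_addresses - 2,
    }

print(subnet_summary("192.168.10.0/26"))
# A /26 leaves 6 host bits: 64 addresses, 62 of them usable
```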
“The FCC is dishonestly claiming that it is promoting equity and fairness. However, the FCC is just seizing control over business decisions, funneling resources to politically preferred constituencies.”
Congress, in the 2021 bipartisan infrastructure bill, delegated to the FCC the task to “ensure that all people of the United States benefit from equal access to broadband internet access.”
In fact, the agency found no evidence of intentional discrimination, but the leftists on the FCC used Congress’ delegation as an excuse to force equity and diversity mandates ranging from controls over discounts, language options, and credit checks to marketing and advertising.
The Internet started in the 1960s as a way for government researchers to share information. Computers in the '60s were large and immobile and in order to make use of information stored in any one computer, one had to either travel to the site of the computer or have magnetic computer tapes sent through the conventional postal system.
Another catalyst in the formation of the Internet was the heating up of the Cold War. The Soviet Union's launch of the Sputnik satellite spurred the U.S. Defense Department to consider ways information could still be disseminated even after a nuclear attack. This eventually led to the formation of the ARPANET (Advanced Research Projects Agency Network), the network that ultimately evolved into what we now know as the Internet. ARPANET was a great success but membership was limited to certain academic and research organizations who had contracts with the Defense Department. In response to this, other networks were created to provide information sharing.
January 1, 1983 is considered the official birthday of the Internet. Prior to this, the various computer networks did not have a standard way to communicate with each other. A new communications protocol was established called Transmission Control Protocol/Internet Protocol (TCP/IP). This allowed different kinds of computers on different networks to "talk" to each other. ARPANET and the Defense Data Network officially changed to the TCP/IP standard on January 1, 1983, hence the birth of the Internet. All networks could now be connected by a universal language.
SpaceX also faulted the FCC for relying on Ookla speed tests:
For instance, the Bureau's decision arbitrarily penalized SpaceX—and only SpaceX—for not meeting RDOF speed requirements years before SpaceX had any obligation to do so. The arbitrariness of applying this unstated standard exclusively to SpaceX was only compounded by the Bureau's reliance on Ookla nationwide speed tests without any notice that it planned to use such tests and even though those nationwide averages included areas that would not be served by RDOF. Even so, Starlink likely recorded the fastest speeds of any operator in the locations eligible for RDOF funds... Starlink has also deployed its service in advance of all RDOF deployment milestones and well ahead of most, if not all, RDOF awardees.
Notice the format of the hostname: ::ffff:a.b.c.d
I had to look this up: this is an IPv4-mapped IPv6 address, a format for representing an IPv4 address using IPv6 address notation.
From Wikipedia article on IPv6 addresses:
::ffff:0:0/96: This prefix is used for IPv6 transition mechanisms and designated as an IPv4-mapped IPv6 address.
With a few exceptions, this address type allows the transparent use of the transport layer protocols over IPv4 through the IPv6 networking application programming interface. In this dual-stack configuration, server applications only need to open a single listening socket to handle connections from clients using IPv6 or IPv4 protocols. IPv6 clients are handled natively by default, and IPv4 clients appear as IPv6 clients at their IPv4-mapped IPv6 address. Transmission is handled similarly; established sockets may be used to transmit IPv4 or IPv6 datagrams, based on the binding to an IPv6 address, or an IPv4-mapped address.
The IPv4 part of the address can also be represented in hexadecimal: ::ffff:aabb:ccdd
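Both notations describe the same 128-bit value, and Python's standard ipaddress module can translate between them (192.0.2.128 below is just a documentation-range example address):

```python
import ipaddress

# Parse the dotted-quad form of an IPv4-mapped IPv6 address...
mapped = ipaddress.IPv6Address("::ffff:192.0.2.128")

# ...Python renders it in the compressed hexadecimal form:
print(mapped)              # ::ffff:c000:280

# The embedded IPv4 address can be recovered directly:
print(mapped.ipv4_mapped)  # 192.0.2.128
```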
Instead of carefully crafting a framework that identifies bad actors, describes their discriminatory actions, and outlines solutions to them, the FCC just assumes everyone is guilty. The regulators will treat any entity that tries to build up the next generation of internet access as purveyors of systemic injustice. The agency’s order does not give tangible examples of violations but operates under the premise that it should punish all broadband internet companies.
The regime applies to every company in the broadband internet space. It even applies to the small business contractors who build and maintain the infrastructure. These operators and technicians simply build where governments have permitted them to construct cell towers or lay fiber. But if the FCC deems that their work promotes discrimination, then bureaucrats will investigate and punish the workers on the frontlines.
This bureaucracy will hamstring the entire internet ecosystem. The rules will hinder industry leaders from developing and deploying new technologies that could transform internet access. Companies might fear that the FCC will interpret their best efforts as discrimination if all communities do not have an “equitable” opportunity to adopt the innovations.
For example, in Pennsylvania, the population of Amish residents in Lancaster County is more than 39,000. As we learned from last month’s emergency alert test, there are quite a few Amish individuals who enjoy digital connectivity. If the FCC does not think enough Amish people subscribe to cell phone plans or use Wi-Fi in their barns, then the agency has granted itself the authority to investigate the supposed shortcoming as a violation of the digital discrimination order. Think of the absurdity: The FCC could actually punish a provider for not selling cellphones to enough Amish people. With this new regime, it is clear there are no limits to what the FCC will consider a breach.
Instead of an arbitrary and undefined regime, the FCC would better serve the nation by establishing a framework that encourages cities and municipalities to promote the deployment of next-generation internet access. Too many communities, such as New York City, are dragging their feet. Others, like San Jose, are delaying deployment by charging internet providers exorbitant fees to build out these transformative networks.
It is clear the FCC’s rules are not concerned with improving internet access and upward mobility. Instead, they’re intended to dramatically expand the federal government’s power. There are real challenges to closing the digital divide, but the new order will not help that effort.
Now is the time to empower creators and innovators who are bringing new ideas to life. The FCC should work to promote new opportunities and technologies that will enable upward mobility rather than create a regime that punishes entrepreneurs who dare to take chances.
See this page fetch itself, byte by byte, over TLS
- This page performs a live, annotated https: request for its own source. It’s inspired by The Illustrated TLS 1.3 Connection and Julia Evans’ toy TLS 1.3.
- It’s built on subtls, a pure-JS TLS 1.3 implementation that depends only on SubtleCrypto. Raw TCP traffic is carried via a serverless WebSocket proxy.
The history of TCP congestion control is long enough to fill a book (and we did) but the work done in Berkeley, California, from 1986 to 1998 casts a long shadow, with Jacobson’s 1988 SIGCOMM paper ranking among the most cited networking papers of all time.
Slow-start, AIMD (additive increase, multiplicative decrease), RTT estimation, and the use of packet loss as a congestion signal were all in that paper, laying the groundwork for the following decades of congestion control research. One reason for that paper's influence, I believe, is that the foundation it laid was solid, while it left plenty of room for future improvements, as we see in the continued efforts to improve congestion control today.
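The AIMD rule at the heart of that work is simple enough to sketch in a few lines; this is a toy model of the window dynamics, not Jacobson's implementation, using the classic increase of 1 segment per RTT and a halving on loss:

```python
def aimd_update(cwnd: float, loss: bool, alpha: float = 1.0, beta: float = 0.5) -> float:
    """One round trip of AIMD: grow by alpha per RTT, multiply by beta on loss."""
    return cwnd * beta if loss else cwnd + alpha

# Window evolution over five round trips, with a single loss on the fourth:
cwnd = 10.0
history = []
for loss in (False, False, False, True, False):
    cwnd = aimd_update(cwnd, loss)
    history.append(cwnd)
print(history)  # the classic sawtooth: [11.0, 12.0, 13.0, 6.5, 7.5]
```

That sawtooth, repeated across millions of independent flows, is what lets senders probe for spare bandwidth while backing off quickly when the network signals congestion.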
And the problem is fundamentally hard: we’re trying to get millions of end-systems that have no direct contact with each other to cooperatively share the bandwidth of bottleneck links in some moderately fair way using only the information that can be gleaned by sending packets into the network and observing when and whether they reach their destination.
It seems clear that there is no such thing as the perfect congestion control approach, which is why we continue to see new papers on the topic 35 years after Jacobson’s. But the internet's architecture has fostered the environment in which effective solutions can be tested and deployed to achieve distributed management of shared resources.
In my view that’s a great testament to the quality of that architecture. ®