488 private links
Google says you can't turn off AI overviews in the main search engine. I'm still seeing the "Labs" icon in the top right, with some checkboxes for AI features, but those checkboxes are no longer respected—some queries will bring up an AI overview no matter what. What you can do is go find a new "Web" filter, which can live alongside the usual filters like "Videos," "Images," "Maps," and "Shopping." That's right, a "Web" filter for what used to be a web search engine. Google says the Web filter can appear in the main tab bar depending on the query (when would a web filter not be appropriate?), but I've only ever seen it buried deep in the "More" section.
Once you do find the Web filter, the results will look like old-school Google. You get 10 blue links, and that's it, with everything else (Google Maps, answer info boxes, etc.) disabled. Sadly, unlike old-school Google, these are still the current Google web results, so they'll be dominated by SEO sites rather than ranked by page quality.
Google says AI Overviews are rolling out to "hundreds of millions of users" this week, with "over a billion people" seeing the feature by the end of the year, as Google expands AI Overview to more countries. //
The power-user way to use Google Search web now takes a lot of clicks. You'd want to click on "more" and then "Web" for actual web results, and then to get Google to actually pay attention to the words you type in, you'd want to click "Tools" and change "all results" to "verbatim." Alternatively, you could also find a more web-focused search engine instead of Google.
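Those clicks map to URL parameters, so a bookmark or browser custom search engine can skip them. The parameters below (`udm=14` for the Web filter, `tbs=li:1` for verbatim) are unofficial, widely reported, and could change at any time; this is just a sketch of building such a URL:

```python
from urllib.parse import urlencode

def web_verbatim_search_url(query: str) -> str:
    # udm=14 selects the "Web" filter (10 blue links, no AI overview);
    # tbs=li:1 turns on verbatim mode. Both are unofficial and may change.
    params = urlencode({"q": query, "udm": 14, "tbs": "li:1"})
    return f"https://www.google.com/search?{params}"

print(web_verbatim_search_url("dhcp option 121"))
# https://www.google.com/search?q=dhcp+option+121&udm=14&tbs=li%3A1
```

Registering that URL pattern as a browser search engine reduces the whole dance to typing a keyword.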
The attack works by manipulating the DHCP server that allocates IP addresses to devices joining the local network. A setting known as option 121 allows the DHCP server to override the default routing rules that send VPN traffic through a local IP address that initiates the encrypted tunnel. By using option 121 to push routes that send VPN-bound traffic to the DHCP server, the attack diverts the data to the DHCP server itself, outside the encrypted tunnel. //
We use DHCP option 121 to set a route on the VPN user’s routing table. The route we set is arbitrary and we can also set multiple routes if needed. By pushing routes that are more specific than a /0 CIDR range that most VPNs use, we can make routing rules that have a higher priority than the routes for the virtual interface the VPN creates. We can set multiple /1 routes to recreate the 0.0.0.0/0 all traffic rule set by most VPNs. //
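The /1 trick relies on longest-prefix-match routing: two /1 routes together cover the same addresses as the VPN's 0.0.0.0/0 route, but each is more specific, so it wins. The arithmetic can be checked with Python's `ipaddress` module (an illustration of the routing math, not the attack tooling):

```python
import ipaddress

# The all-traffic route most VPNs install on the virtual interface.
default = ipaddress.ip_network("0.0.0.0/0")

# Splitting it once yields two /1 routes that together cover every IPv4
# address; being more specific than /0, each takes priority over it.
halves = list(default.subnets(prefixlen_diff=1))
print(halves)  # [IPv4Network('0.0.0.0/1'), IPv4Network('128.0.0.0/1')]

# Any destination address falls into exactly one of the two halves,
# so pushing both via option 121 captures all traffic.
addr = ipaddress.ip_address("8.8.8.8")
assert sum(addr in net for net in halves) == 1
```

The same logic generalizes: a pushed route only needs a longer prefix than the VPN's route to steal the traffic it covers.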
Interestingly, Android is the only operating system that fully immunizes VPN apps from the attack because it doesn't implement option 121. For all other OSes, there are no complete fixes. When apps run on Linux there’s a setting that minimizes the effects, but even then TunnelVision can be used to exploit a side channel that can be used to de-anonymize destination traffic and perform targeted denial-of-service attacks. //
The most effective fixes are to run the VPN inside of a virtual machine whose network adapter isn’t in bridged mode or to connect the VPN to the Internet through the Wi-Fi network of a cellular device.
Carr, who spoke for more than half an hour, described how the FCC's net neutrality decisions were allegedly swayed by President Obama in 2015 and by President Biden this year. "The FCC has never been able to come up with a credible reason or policy rationale for Title II. It is all just shifting sands, and that is because the agency is doing what it's been told to do by the executive branch," Carr said. //
"Congress never passed a law saying the Internet should be heavily regulated like a utility, nor did it pass one giving the FCC the authority to make that determination. The executive branch pressured the agency into claiming a power that remained, and remains, with the legislative branch," Carr said.
Twenty years ago, in a world dominated by dial-up connections and a fledgling World Wide Web, a group of New Zealand friends embarked on a journey. Their mission? To bring to life a Matrix fan film shot on a shoestring budget. The result was The Fanimatrix, a 16-minute amateur film just popular enough to have its own Wikipedia page.
As reported by TorrentFreak, the humble film would unknowingly become a crucial part of torrent history. It now stands as the world’s oldest active torrent, with an uptime now spanning a full 20 years.
Check DNS, URLs + Redirects, Certificates and Content of your Website
Société Internationale de Télécommunications Aéronautiques
(SITA) Neuilly France
INTRODUCTION
1.1. SITA (Société Internationale de Télécommunications Aéronautiques), a cooperative company founded in 1949, embraces the majority of the international air carriers (more than 160). It provides its members with a worldwide message switching network.
1.2. Initially the network consisted of manual (torn-tape) centres, interconnected by low speed circuits (50, 75 Bauds; 60, 30, 15 words per minute; asynchronous). The airline terminal equipment (teleprinters, Telex) was connected to the SITA manual centres, thus enabling airline messages to be exchanged via nodes of the SITA network, with consequent reduction in costs to the airlines by their sharing of communications facilities.
1.3. With the rapid development of the Air Transport Industry, the airline communications needs became increasingly important and thus the SITA network expanded very quickly, by 1963 covering the world. Network development was not, however, restricted to geographic extension; in 1963 a number of the busiest manual centres were replaced by semi-automatic systems, and three years later, due to the continuing steady increase of traffic volumes, SITA equipped the Frankfurt centre with its first computer system to perform the message switching functions. Then, in 1969, SITA began replacing the other most heavily loaded centres (Western Europe and New York) with computer systems and established a computer communication data network by interconnecting these centres with voice grade circuits (medium speed). This network, called the High Level Network, performing the task of block switching, was interfaced at that time with the rest of the network composed of manual centres. This step was soon followed by the automation of other manual centres using what are in SITA terminology called satellite processors. These stand-alone computers act as concentrators of airline teleprinter traffic and controllers of airline CRT terminals, each of them connected to one High Level Centre by medium speed circuits. By mid-1973, the SITA network comprised 150 centres including 8 high level centres and 21 satellite processors. The 29 automated centres will be referred to as the SITA medium speed network (see figure 1).
IPFS is just a technology, not a predatory financial gambit. It is a set of peer-to-peer protocols for finding content on a decentralized network. It relies on a Content Identifier (CID) rather than a location (URL), which means the focus is on the identity of the content (a hash) rather than a server where it's stored.
IPFS focuses on representing and addressing data, routing it, and transferring it. It's not a storage service, though storage is necessary to use it. It's been adopted by Cloudflare and implemented in Brave and Opera, among others, and work is being done to make it work in Chromium.
IPFS traffic is public unless encrypted, which is why there are rival decentralized projects that strive for stronger built-in privacy protection like Veilid.
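The difference between addressing by content and addressing by location is easy to demonstrate. Real IPFS CIDs wrap a digest in multihash and multibase encodings, but a bare SHA-256 digest is enough to show the core idea, so this sketch deliberately simplifies:

```python
import hashlib

def content_id(data: bytes) -> str:
    # Simplified stand-in for a CID: the address is derived from the
    # bytes themselves, not from the server where they happen to live.
    return hashlib.sha256(data).hexdigest()

a = content_id(b"hello ipfs")
b = content_id(b"hello ipfs")
c = content_id(b"hello ipfs!")

assert a == b  # identical content -> identical address, on any node
assert a != c  # any change to the bytes yields a different address
```

This is why a CID can be fetched from any peer that holds the data: the identifier itself proves the content is what was asked for.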
Parts of Africa were already seeing web disruptions from damaged Red Sea cables.
Our pledge
We'll never raise money. We'll never get acquired. We'll never shut down.
In fact, we don't even take salaries. All proceeds go directly to sustaining Posthaven for the next 100+ years.
Forever hosting
Pay for at least a year and your site stays online.
If we can't charge your card, your site goes into read only mode. Even if something catastrophic happens, your content will remain online.
a plan designed exclusively for those seeking the ultimate in security and longevity for their digital presence.
Safeguard your online legacy with the 100-Year Plan. This brand-new offering is for:
- Families who wish to preserve their digital assets—the stories, photos, sounds, and videos that make up their rich family history—for generations to come.
- Founders who want to protect and document their company’s past, present, and future.
- Individuals seeking a stable, flexible, and customized online home that can adapt to whatever changes the future of technology will bring. //
The 100-Year Plan isn’t just about today. It’s an investment in tomorrow. Whether you’re cementing your own digital legacy or gifting 100 years of a trusted platform to a loved one, this plan is a testament to the future’s boundless potential.
The cost is $38,000. We hope people renew. If you’re interested in learning more, fill out the form found here:
Dealing with your digital legacy after your death is a big issue, and one that requires a lot of thought and a lot of problems to be solved, so let’s break it down into smaller pieces and think about them individually. This post is primarily a collection of thoughts about dealing with the problem from the domains side, not hosting. Hosting is a problem for more posts.
The internet isn’t that old and most of the pioneers are still around. But we can see the wave coming, so let’s try to solve this problem before it breaks.
With this online TCP port scanner you can scan an IP address for open ports.
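The underlying check such scanners perform is just a TCP connection attempt per port. A minimal sketch using Python's standard library (the local test server is only there to give the scan something to find):

```python
import socket

def scan_port(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds (port open)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo: open a listening socket locally, then scan it.
server = socket.socket()
server.bind(("127.0.0.1", 0))  # OS picks a free port
server.listen(1)
open_port = server.getsockname()[1]

print(scan_port("127.0.0.1", open_port))  # True: something is listening
server.close()
```

A full scanner would loop over a port range and usually probe many ports concurrently to keep the total time reasonable.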
Reliable and free network scanner to analyze LAN. The program shows all network devices, gives you access to shared folders, provides remote control of computers (via RDP and Radmin), and can even remotely switch computers off. It is easy to use and runs as a portable edition. It should be the first choice for every network admin.
Despite Tokelau having a population of just 1,400, its .tk domain until recently had more users than any other country's. Here’s why.
Douglas Engelbart changed computer history forever on December 9, 1968.
A half century ago, computer history took a giant leap when Douglas Engelbart—then a mid-career 43-year-old engineer at Stanford Research Institute in the heart of Silicon Valley—gave what has come to be known as the "mother of all demos."
On December 9, 1968 at a computer conference in San Francisco, Engelbart showed off the first inklings of numerous technologies that we all now take for granted: video conferencing, a modern desktop-style user interface, word processing, hypertext, the mouse, collaborative editing, among many others.
Even before his famous demonstration, Engelbart outlined his vision of the future more than a half-century ago in his historic 1962 paper, "Augmenting Human Intellect: A Conceptual Framework."
To open the 90-minute-long presentation, Engelbart posited a question that almost seems trivial to us in the early 21st century: "If in your office, you as an intellectual worker were supplied with a computer display, backed up by a computer that was alive for you all day, and was instantly responsible—responsive—to every action you had, how much value would you derive from that?"
Of course at that time, computers were vast behemoths that were light-years away from the pocket-sized devices that have practically become an extension of ourselves.
Engelbart, who passed away in 2013, was inspired by a now-legendary essay published in 1945 by Vannevar Bush, a physicist who had been in charge of the United States Office of Scientific Research and Development during World War II.
That essay, "As We May Think," speculated on a "future device for individual use, which is a sort of mechanized private file and library." It was this essay that stuck with a young Engelbart—then a Navy technician stationed in the Philippines—for more than two decades.
By 1968, Engelbart had created what he called the "oN-Line System," or NLS, a proto-Intranet. The ARPANET, the predecessor to the Internet itself, would not be established until late the following year.
Five years later, in 1973, Xerox debuted the Alto, considered to be the first modern personal computer. That, in turn, served as the inspiration for both the Macintosh and Microsoft Windows, and the rest, clearly, is history.
Oct 29, 2019
Marlor_AU Ars Tribunus Angusticlavius 20y 6,800
Rector said:
LordEOD said:
Did Al Gore Say ‘I Invented the Internet’?
"Despite the multitudinous derisive references to the supposed quote that continue to be proffered even today, former U.S. vice president Al Gore never claimed that he “invented” the Internet, nor did he say anything that could reasonably be interpreted that way."
Al Gore never said that he invented the internet. He said he created the internet. See direct quote above.
Gore was the driving force behind legislation that opened up various agency-specific networks and united them to create the precursor to today’s internet. Without this legislation, the siloed networks would have continued developing in parallel. From Kahn and Cerf (linked above):
“As far back as the 1970s Congressman Gore promoted the idea of high speed telecommunications as an engine for both economic growth and the improvement of our educational system. He was the first elected official to grasp the potential of computer communications to have a broader impact than just improving the conduct of science and scholarship. Though easily forgotten, now, at the time this was an unproven and controversial concept. Our work on the Internet started in 1973 and was based on even earlier work that took place in the mid-late 1960s. But the Internet, as we know it today, was not deployed until 1983. When the Internet was still in the early stages of its deployment, Congressman Gore provided intellectual leadership by helping create the vision of the potential benefits of high speed computing and communication. As an example, he sponsored hearings on how advanced technologies might be put to use in areas like coordinating the response of government agencies to natural disasters and other crises.
As a Senator in the 1980s Gore urged government agencies to consolidate what at the time were several dozen different and unconnected networks into an "Interagency Network." Working in a bi-partisan manner with officials in Ronald Reagan and George Bush's administrations, Gore secured the passage of the High Performance Computing and Communications Act in 1991. This "Gore Act" supported the National Research and Education Network (NREN) initiative that became one of the major vehicles for the spread of the Internet beyond the field of computer science.”
If the guys who created the protocols that drive the internet are happy to credit Gore as the administrative driving force behind it, then who can argue with that?
Nathan2055 Ars Centurion 7y 355 Subscriptor
mknelson said:
tjukken said:
"Without ARPANET, there would have been no Internet"I doubt that is true.
The internet is more than just wires between computers (or Tubes if you're from Alaska). It's the protocols that make it all work.
The origins of those protocols and the hardware they work on in many ways have their origins in the early work on ARPANET
This is exactly right.
ARPANET was, indisputably, the first network to implement TCP/IP, which eventually became the backbone protocols of the modern Internet. Now, ARPANET did not originally launch with TCP/IP; it originally used the far more simplistic Network Control Program, which had severe limitations and was not standardized outside of ARPANET. The need for a single, standardized protocol for sharing network resources that could be utilized by any computer system led to the development of the Transmission Control Program in 1974. Incidentally, the RFC for this system, RFC 675, is notable as it contains the first use of the word Internet, intended as a shorthand for "internetworking."
Transmission Control Program would later be split into the Transmission Control Protocol and the Internet Protocol, separating the transport layer and network layer for increased modularity. TCP/IP was declared the new standard for computer networking by the DoD in March 1982, with University College London following shortly after, and the entire network was rebooted and switched over on January 1, 1983. This is another candidate for the Internet's birthday, along with March 12, 1989 (Tim Berners-Lee submitting his original proposal for CERN to adopt an information management system based on "hypertext"), December 20, 1990 (the first hypertext web page using HTML and HTTP was published, www.cern.ch, describing the World Wide Web project), and January 1, 1990 (the first fully commercial Internet backbone, not controlled or limited in any way by government or educational institutions, was switched on by PSInet).
That being said, I'd certainly argue that the first long-distance computer network connection, even if it wasn't using TCP/IP yet, makes the most sense to celebrate as the Internet's birthday.
RoninX Ars Tribunus Militum 12y 2,849 Subscriptor
Dibbit said:
tjukken said:
"Without ARPANET, there would have been no Internet"I doubt that is true.
I kinda concur.
While Arpanet was very early and important, the way it is framed always brings to mind the image that everything grew out of this original network. The truth is more that individual networks grew bigger and bigger and linked up. For instance, the Swiss network was already interconnected before hooking up to the "main node" so to speak.
If Arpanet hadn't been there, there would've been another big network that would've been appointed "the origin"
There would undoubtedly have been some sort of global computer network without the ARPANET, but it might have taken a very different form.
It could have evolved from something like Compuserve into a more cable-TV model, with a sharp distinction between computers that could serve information (which would need to be approved by the network supplier) and those that just accessed that information. So, like Comcast, except you would need Comcast's approval for any information you wanted to post to the network, and undoubtedly have to pay an additional fee.
Or it could have evolved from the heavily-regulated world of amateur radio, where some hobbyists were experimenting with packet radio for teletype communication -- where every user needs to have a license, and things like profanity are strictly prohibited.
Or it could have become a government bureaucracy, like the Post Office or the DMV, where the service is paid for by a combination of taxes and user fees, and all use is both licensed and tracked to individual users.
Or it could have grown out of Fidonet into an even more distributed model, where all of the networking was peer-to-peer, evolving into a network that would have been like torrents on steroids.
Or it could have been built and owned by Microsoft and have only supported Windows PCs.
Or any one of a dozen other possibilities.
Computer networks were inevitable, but the fact that the Internet works the way it currently does -- for better or worse -- is directly a result of the architecture pioneered by the ARPANET.
The precursor to the Internet carried its first login request on October 29, 1969. //
On October 29, 1969, at 10:30pm Pacific Time, the first two letters were transmitted over ARPANET. And then it crashed. About an hour later, after some debugging, the first actual remote connection between two computers was established over what would someday evolve into the modern Internet.
Funded by the Advanced Research Projects Agency (the predecessor of DARPA), ARPANET was built to explore technologies related to building a military command-and-control network that could survive a nuclear attack. But as Charles Herzfeld, the ARPA director who would oversee most of the initial work to build ARPANET, put it:
The ARPANET was not started to create a Command and Control System that would survive a nuclear attack, as many now claim. To build such a system was, clearly, a major military need, but it was not ARPA's mission to do this; in fact, we would have been severely criticized had we tried. Rather, the ARPANET came out of our frustration that there were only a limited number of large, powerful research computers in the country, and that many research investigators, who should have access to them, were geographically separated from them. //
The first letters transmitted, sent from UCLA to Stanford by UCLA student programmer Charley Kline, were "l" and "o." On the second attempt, the full message text, login, went through from the Sigma 7 to the 940. So, the first three characters ever transmitted over the precursor to the Internet were L, O, and L. //
When it was shut down, Vinton Cerf, one of the fathers of the modern Internet, wrote a poem in ARPANET's honor:
It was the first, and being first, was best,
but now we lay it down to ever rest.
Now pause with me a moment, shed some tears.
For auld lang syne, for love, for years and years
of faithful service, duty done, I weep.
Lay down thy packet, now, O friend, and sleep.
On Thursday, Internet pioneer Vint Cerf announced that Dr. David L. Mills, the inventor of Network Time Protocol (NTP), died peacefully at age 85 on January 17, 2024. The announcement came in a post on the Internet Society mailing list after Cerf was informed of David's death by Mills' daughter, Leigh.
"He was such an iconic element of the early Internet," wrote Cerf.
Dr. Mills created the Network Time Protocol (NTP) in 1985 to address a crucial challenge in the online world: the synchronization of time across different computer systems and networks. In a digital environment where computers and servers are located all over the world, each with its own internal clock, there's a significant need for a standardized and accurate timekeeping system.
NTP provides the solution by allowing clocks of computers over a network to synchronize to a common time source. This synchronization is vital for everything from data integrity to network security. For example, NTP keeps network financial transaction timestamps accurate, and it ensures accurate and synchronized timestamps for logging and monitoring network activities.
In the 1970s, during his tenure at COMSAT and involvement with ARPANET (the precursor to the Internet), Mills first identified the need for synchronized time across computer networks. His solution aligned computers to within tens of milliseconds. NTP now operates on billions of devices worldwide, coordinating time across every continent, and has become a cornerstone of modern digital infrastructure.
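The protocol's wire format is remarkably small. The sketch below builds the minimal 48-byte SNTP client request and converts NTP timestamps (seconds since 1900) to Unix time (seconds since 1970); it stops short of the actual network exchange, which a real client would do over UDP port 123:

```python
import struct

# NTP timestamps count seconds since 1900-01-01; Unix time since 1970-01-01.
NTP_UNIX_DELTA = 2208988800  # seconds between the two epochs

def build_sntp_request() -> bytes:
    # 48-byte packet: LI=0, VN=3, Mode=3 (client) packed into the first
    # byte, everything else zeroed -- the minimal request a server accepts.
    return struct.pack("!B47x", (0 << 6) | (3 << 3) | 3)

def ntp_to_unix(seconds: int, fraction: int) -> float:
    # Timestamps arrive as 32-bit whole seconds plus a 32-bit fraction.
    return seconds - NTP_UNIX_DELTA + fraction / 2**32

assert len(build_sntp_request()) == 48
assert ntp_to_unix(NTP_UNIX_DELTA, 0) == 0.0  # NTP time of the Unix epoch

# A real client would send the request to a server and unpack the server's
# transmit timestamp from bytes 40..47 of the 48-byte reply:
#   secs, frac = struct.unpack("!II", reply[40:48])
#   unix_time = ntp_to_unix(secs, frac)
```

Full NTP goes further, exchanging four timestamps per round trip so the client can estimate and subtract network delay, which is how it reaches the millisecond-level alignment described above.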