Technology
Almost 40 percent of webpages from 2013 have succumbed to digital decay
Have you been searching for an article you read several years ago but just can’t seem to find it? If it was written in 2013, there’s a good chance it has vanished from the internet. According to recent research from the Pew Research Center, a significant number of webpages created in 2013 have become inaccessible due to “digital decay.” The study found that nearly 40 percent of those pages are no longer available.
The new analysis highlights the transient nature of online content, challenging the notion that the internet is forever. Digital decay refers to the gradual deterioration, corruption, or obsolescence of digital information over time.
According to the findings, 38 percent of the content that was online in 2013 can no longer be accessed. When the researchers broadened their analysis, they made a striking discovery: a full 25 percent of all webpages that existed at some point between 2013 and 2023 are now inaccessible. Typically, this was because the page had been deleted or removed from an otherwise functional website.
In this context, the team defined an “inaccessible” page as one that is no longer available on the host server, which typically produces a 404 or another error code.
To collect their data, the researchers took random samples of nearly 1 million webpages from the Common Crawl archives, an internet repository that captures snapshots of the web at various points in time. They drew samples from each year between 2013 and 2023 and then checked whether those pages still existed.
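The study’s accessibility categories can be sketched in a few lines of code. This is an illustrative sketch, not Pew’s actual pipeline: the function name, the sample data, and the category labels are all assumptions. The classification logic, however, follows the report’s definitions: an error code such as 404 on a live domain counts as an individually inaccessible page, while a root domain that no longer resolves is counted separately.

```python
# Hypothetical sketch of the report's link-rot categories (not Pew's code).

def classify_page(domain_resolves, status_code):
    """Return 'accessible', 'page_inaccessible', or 'domain_gone'."""
    if not domain_resolves:
        return "domain_gone"           # root domain no longer exists
    if status_code is not None and 200 <= status_code < 300:
        return "accessible"
    return "page_inaccessible"         # e.g. a 404 on an otherwise live site

# Tally a tiny made-up sample the way the report buckets its results.
sample = [
    (True, 200), (True, 404), (False, None), (True, 200), (True, 500),
]
counts = {}
for domain_ok, status in sample:
    label = classify_page(domain_ok, status)
    counts[label] = counts.get(label, 0) + 1
print(counts)  # {'accessible': 2, 'page_inaccessible': 2, 'domain_gone': 1}
```

In the real study these two failure modes correspond to the 16 percent of pages lost from live domains and the 9 percent lost along with their domains.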
Approximately 25 percent of the pages created during this period were no longer accessible as of October 2023. That total breaks down into two categories of dead content: 16 percent of pages were individually inaccessible but sat on otherwise accessible root-level domains, while the remaining 9 percent were unreachable because the root domain itself had ceased to exist.
“As expected, the older snapshots in our collection had the highest proportion of inaccessible links,” explained the authors of the report.
By the end of 2023, a significant portion of the pages collected in the 2013 snapshot had disappeared. Even the much more recent 2021 snapshot had lost roughly one in five pages.
There were also intriguing comparative findings for different types of webpages. The researchers analyzed the reference links on 50,000 English-language Wikipedia pages and found that a large majority of the sampled pages, 82 percent, contained at least one reference link directing readers to an external website. Concerningly, though, 11 percent of the references cited on Wikipedia were no longer accessible.
About 2 percent of the sampled source pages had all of their reference links broken, while roughly 53 percent of pages contained at least one broken link.
Government websites had some interesting features of their own. Approximately 75 percent of the 500,000 government webpages analyzed contained at least one outbound link; the average page had 50 links, and quite a few had considerably more. Most of these links pointed to secure HTTPS pages, while a small percentage redirected elsewhere.
However, approximately 21 percent of the government pages analyzed contained at least one broken link, and city government pages appeared to be the most problematic in this regard.
News sites were not exempt from the issue either. The researchers found that a large majority of the news sites analyzed, approximately 94 percent, included at least one outbound link directing readers away from the site. The typical page had around 20 links, while the top 10 percent of pages had around 56.
As with government websites, the majority of these links pointed to secure HTTPS pages. Approximately 32 percent of the links on these news sites redirected users to a different URL than the one originally provided. About 5 percent of news website links were inaccessible, and roughly 23 percent of all pages contained at least one broken link.
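The kind of per-page link audit described above can be sketched with nothing but the Python standard library. In the sketch below, the page HTML, the `BROKEN` set, and all names are hypothetical stand-ins; a real audit would fetch each link over HTTP rather than consult a hard-coded set. It computes the two statistics the study reports: the share of links that are broken, and the number of pages with at least one broken link.

```python
# Hedged sketch of a per-page outbound-link audit (stdlib only).
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html_text):
    parser = LinkCollector()
    parser.feed(html_text)
    return parser.links

# Hypothetical pages; in a real audit each link would be checked over HTTP.
pages = [
    '<a href="https://ok.example">x</a><a href="https://dead.example">y</a>',
    '<a href="https://ok.example">z</a>',
]
BROKEN = {"https://dead.example"}  # stand-in for real HTTP status checks

all_links = [link for page in pages for link in extract_links(page)]
broken_share = sum(link in BROKEN for link in all_links) / len(all_links)
pages_with_broken = sum(
    any(link in BROKEN for link in extract_links(page)) for page in pages
)
print(round(broken_share, 2), pages_with_broken)
```

Scaled up to hundreds of thousands of pages, tallies like these yield exactly the kinds of figures the report cites.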
Turning to Twitter (now X), the researchers found that among the 5 million tweets sampled from March 2013 to 2023, a significant 18 percent were no longer accessible.
“In most instances, this occurred because the account that initially shared the tweet had either become private, suspended, or completely deleted,” the researchers explained. For the remaining tweets, the account that originally posted them was still visible on the site, but the specific tweet itself had been removed.
Tweets in certain languages were more likely to disappear or be deleted: a significant portion of Turkish-language tweets, and a smaller percentage of Arabic-language tweets, were no longer accessible.
Tweets that are removed from the site typically vanish shortly after being posted.
The report can be found on the Pew Research Center website.
Artificial Intelligence
Google DeepMind Shows Off A Robot That Plays Table Tennis At A Fun “Solidly Amateur” Level
Have you ever wanted to play table tennis but had no one to play with? Good news: Google DeepMind has just shown off a robot that could give you a run for your money in a game. But don’t worry about being thrashed, because the engineers say their robot plays at a “solidly amateur” level.
From scary faces to robo-snails that work together to the now happily retired Atlas, it seems we’re always just one step away from another amazing robotics achievement. Still, there is plenty humans can do that robots haven’t come close to matching.
Engineers are still trying to build machines that can match human speed and performance in physical tasks. With their table-tennis-playing robot, a team at DeepMind has taken a step toward that goal.
In their new preprint, which has not yet been published in a peer-reviewed journal, the team notes that competitive matches are often incredibly dynamic, involving complex movements, rapid hand-eye coordination, and high-level strategies that adapt to an opponent’s strengths and weaknesses. Pure strategy games like chess, which robots already excel at (albeit with… mixed results), lack these features; games like table tennis have them.
Human players spend years of practice to get good. The DeepMind team wanted to build a robot that could genuinely compete with a human opponent while making the game fun for both sides, and they say theirs is the first to achieve these goals.
Their solution pairs a library of “low-level skills” with a “high-level controller” that picks the best skill for each situation. As the team explained in their announcement, the skill library contains a number of different table tennis techniques, such as forehand and backhand strokes. The controller uses descriptions of these skills, along with information about the state of the game and the opponent’s skill level, to choose the best skill it can physically execute.
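The skill-library-plus-controller design can be illustrated with a toy sketch. Everything here, including the skill names, the risk/reward numbers, and the scoring rule, is hypothetical and far simpler than DeepMind’s learned policy; it only shows the architectural idea of a high-level controller selecting a feasible low-level skill using game state and an estimate of the opponent.

```python
# Toy sketch of a skill library + high-level controller (all values invented).

SKILLS = {
    "forehand_drive":   {"side": "forehand", "risk": 0.4, "reward": 0.7},
    "backhand_push":    {"side": "backhand", "risk": 0.2, "reward": 0.4},
    "forehand_topspin": {"side": "forehand", "risk": 0.7, "reward": 0.9},
}

def choose_skill(ball_side, opponent_skill):
    """Pick the feasible skill with the best risk-adjusted payoff.

    Against stronger opponents (opponent_skill near 1.0) the risk
    penalty shrinks, so riskier, higher-reward skills win out.
    """
    candidates = {
        name: s for name, s in SKILLS.items() if s["side"] == ball_side
    }
    def score(skill):
        return skill["reward"] - (1.0 - opponent_skill) * skill["risk"]
    return max(candidates, key=lambda name: score(candidates[name]))

print(choose_skill("forehand", opponent_skill=0.9))  # forehand_topspin
print(choose_skill("forehand", opponent_skill=0.1))  # forehand_drive
```

In the actual system the controller’s choice is informed by learned skill descriptors and real-time game statistics rather than a hand-written score.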
The robot started out with some knowledge drawn from data on human play. It was then trained in simulation, learning new skills through reinforcement learning, and continued to learn and adapt by playing against people.
“It’s really cool to see the robot play against players of all skill levels and styles. Our goal was for the robot to be at an intermediate level when we started, and it really did that; all of our hard work paid off,” said Barney J. Reed, a professional table tennis coach who helped with the project. “I think the robot was even better than I thought it would be.”
The team held test matches in which the robot competed against 29 people whose skills ranged from beginner to advanced+. The matches followed standard rules, with one important exception: the robot was physically unable to serve the ball.
The robot won every game against beginners but lost every game against advanced and advanced+ players. Against intermediate opponents it won 55 percent of the time, leading the team to conclude it had reached an intermediate level of human skill.
Importantly, all of the opponents, regardless of skill, found the matches “fun” and “engaging”; they even enjoyed exploiting the robot’s weaknesses. The more skilled players suggested that such a system could be better than a ball thrower as a training aid.
There probably won’t be a robot team in the Olympics any time soon, but it could be used as a training tool. Who knows what will happen in the future?
The preprint has been put on arXiv.
Astronomy
Mars and Jupiter reach their closest proximity in the sky this week, a sight that will not repeat until 2033
On August 14, Mars and Jupiter will be just 0.3 degrees apart in the sky, a very close pass from our point of view. Miss it, and you won’t see another like it until 2033.
When two objects pass each other in the sky from our point of view, it is called a conjunction. If the planets all moved in one perfectly flat plane, the closer planet would block out the more distant one at every conjunction. In reality, each orbit is slightly tilted relative to the others, so the planets pass slightly to the north or south of each other, and the size of that gap differs every time.
Particularly close conjunctions are especially stunning. In 2020, Jupiter and Saturn came close enough to be seen in the same field of view through a telescope, a real treat for skywatchers.
At 0.3 degrees apart, Mars and Jupiter will fit within any field of view that can hold the full Moon, which is about 0.5 degrees wide. The pair will also look good on the days before and after the closest approach.
Even with the naked eye, a close conjunction can make the sky look more striking, and the contrast between the red of Mars and the white of Jupiter will be especially vivid. Mars’ brightness varies a great deal, however; at its brightest it roughly matches Jupiter, but right now it is about 16 times fainter. Both planets are still bright enough that, clouds permitting, you should be able to see them from all but the most light-polluted cities.
Most people will miss the sight, however, because the pair cannot be seen in the evening from anywhere on Earth. The exact time they rise depends on your location, but it is usually between midnight and 3 am, so you will generally need to be up before astronomical twilight begins, giving the planets time to climb above the thickest part of the atmosphere.
For observers in Europe, Africa, west Asia, and the Americas, the moment of closest approach, at 14:53 UTC, falls during daylight. The mornings immediately before and after will look almost as close, though.
Mars and Jupiter meet about every two and a half years, but the most recent conjunction was almost twice as wide and visible only in the morning. In 2029 the gap will be just under two degrees, and the conjunction after that will be wider still.
Closeness in the sky does not mean the planets are physically close to each other. Mars takes 687 days to orbit the Sun and is currently less than 100 days past perihelion, so it is closer to the Sun than usual; Jupiter, meanwhile, is only slightly closer than average. The physical distance between the two is smallest when Mars is at its farthest point from the Sun and Jupiter at its closest, so in that sense this pairing is nothing unusual.
But if you want to see something beautiful, you will have to wait more than nine years to see it again.
Engineering
New concrete that doesn’t need cement could cut carbon emissions in the construction industry
Concrete may be a very common building material, but it is far from the most environmentally friendly one. Because of this, scientists and engineers have been searching for greener alternatives, and they may have found one: concrete that doesn’t need cement.
Production of cement, a crucial ingredient in concrete, ranks as the third most significant contributor to human-caused carbon emissions globally. In recent years, however, a number of alternative techniques for producing more environmentally friendly concrete have emerged. One proposal uses industrial waste and steel slag as CO2-reducing additives in the concrete mixture; another uses spent coffee grounds to strengthen the concrete while reducing the amount of sand required.
Now, one company has devised a technique for producing cement-free concrete suitable for commercial use.
The company says its concrete can deliver a net reduction in carbon dioxide, preventing approximately 1 metric ton of carbon emissions for every metric ton used. If accurate, the cement-free binder would be a noteworthy substitute for Portland cement. According to BGR, the new concrete also complies with all the industry standards of traditional cement concrete, so there is no compromise in strength or durability.
Though still in its early stages, the approach looks encouraging. C-Crete Technologies, a materials science company that holds the patents on the novel concrete, has already used approximately 140 tons of the new cast-in-place (pourable) material in recent construction projects.
In September 2023, the US Department of Energy awarded the company almost $1 million, promptly followed by an additional $2 million, to advance its technology. The company has also garnered numerous accolades that are helping it scale up its operations.
The widespread adoption of cement-free concrete in future construction projects has the potential to significantly alter the environmental impact of the industry. Although C-Crete seems to be one of the few companies currently exploring these new alternatives on a large scale, it is likely that others will also start embracing them in the near future.