Today I want to take a break from SEO to talk about sports – specifically hockey – in terms of SEO concepts.
The 2020 NHL playoffs are (finally) underway.
So in the spirit of debate, let’s find out if it’s possible to measure sports using SEO techniques – and ultimately predict the next NHL Stanley Cup champion.
Measuring Sports Using SEO Techniques
These days, all the cool SEO tools like Screaming Frog and Sitebulb include features they call a crawl map.
They’re actually a pretty cool diagnostic tool you can use for finding all kinds of SEO issues.
Here's an example from WTFSEO so you can see what I'm talking about (and also so I can sneak in a link to it).
Anyway, we were doing some cool work with force-directed graphs at work the other day and it got me thinking of an old experiment I ran last year.
While discussing PageRank’s history of originally being used to measure citations and then being adapted to the web, I wondered: “What else could PageRank be adapted to do?”
Historically, PageRank has been really, really good at measuring the authority of webpages.
My theory was: could it measure the authority of other things, too?
What about using regular season sports data to see if we can find the most authoritative team in the NHL?
Wins and losses can be skewed because some teams get lucky and play my hometown Detroit Red Wings several times per season while other teams only play them once.
So instead of just looking at total wins, what if we could look at the quality of wins?
Looking at the Quality of Wins
If we have a crawl map, we should be able to calculate PageRank.
Several of us old school SEO professionals have been creating these crawl maps manually for years.
I used to love doing them in Google Fusion Tables but then Google took that tool away from us (somebody please bring a version of this back, pretty please?).
Before Fusion Tables, we used a free tool called Gephi.
Gephi has a bit of a learning curve and there’s a ton of posts out there on other SEO sites talking about how to use it for SEO – but that’s not the scope of this post.
I first started looking at sports victories as directed graphs a couple of years ago.
Below is a tweet where I did this with NCAA college football, and as you can see it worked out really well.
(Actually, eigenvectors and harmonic centrality models worked better than PageRank. But I suspect modern PageRank is way different now in Google than it was in the original algorithm. And besides, if we start using eigenvectors then we lose the SEO tie in and I have to find someplace else to write this post.)
— Ryan Jones (@RyanJones) January 31, 2020
So, what about hockey?
Well, last year PageRank successfully predicted that the St. Louis Blues would win the Stanley Cup.
— Ryan Jones (@RyanJones) April 22, 2019
If you check out the above thread it shows that STL, PIT, and BOS were the best 3 teams according to PageRank, and it just so happens that STL beat BOS for the Stanley Cup.
We might be onto something here.
Calculating the PageRank of NHL Regular Season Data
So, with the NHL season about to start up again, let's apply this to NHL regular season data.
Below is the force-directed graph of every NHL game, represented like so:
Every team links to all the teams that beat them.
It’s kind of pretty once we add some color.
Side note: It’s amazing how many teams are red/blue. Here’s hoping Seattle picks a color scheme that isn’t the same as every other team.
Now, all we have to do is run PageRank.
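If you want to try this yourself, here's a minimal sketch of the idea in Python. The article doesn't share its code, so this is my own illustration using the networkx library, and the game results below are hypothetical stand-ins, not the actual season data:

```python
# A minimal sketch of PageRank over regular-season results.
# The (winner, loser) pairs are made-up sample data, not real NHL games.
import networkx as nx

games = [
    ("COL", "STL"), ("COL", "DET"), ("BOS", "DET"),
    ("STL", "BOS"), ("COL", "BOS"), ("DET", "STL"),
]

G = nx.DiGraph()
for winner, loser in games:
    # Every team links to the team that beat it; repeat wins add edge weight.
    if G.has_edge(loser, winner):
        G[loser][winner]["weight"] += 1
    else:
        G.add_edge(loser, winner, weight=1)

# PageRank flows "authority" along those loser -> winner edges,
# so beating strong teams counts for more than beating weak ones.
scores = nx.pagerank(G, alpha=0.85, weight="weight")
for team, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{team}: {score:.3f}")
```

Feed it a full season of (winner, loser) pairs and the scores rank the league by quality of wins rather than raw win totals.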
You can see the results of that below (as well as some other stats for you data geeks):
I’ve already gone ahead and bet 10 units on every first-round playoff matchup based on the team with the higher score here.
I’ve also put some money on the Colorado Avalanche to win the Cup.
As a Detroit fan, this kills me, but Go Avs!
Who’s your team?
Do you think this model will come close to reality or will COVID-19 ultimately make the statistics unreliable?
How To Predict The Election Result With A Coin?
This article was published as a part of the Data Science Blogathon.
The ability to predict events and happenings has been so attractive to many that they have chosen the career of "prophecy" for themselves, attracting many followers and much revenue in the process. Some of those people's names, such as "Nostradamus," are still heard today, and some people are still checking, centuries later, whether his predictions came true.
Forecasting what will happen in the future is considered a critical business skill. In most cases, a company's level of insight and understanding about the future is directly linked to its level of validity, success, and the rate at which it moves through the stages of growth. A company makes plans based on what it expects to happen; because of this, most businesses today pay close attention to data science and spend a lot of money to become more data-driven. Many companies also employ data scientists and set up dedicated data science departments.
It is evident that today's forecasts, developed with data science, differ significantly from those made centuries ago. Forecasting and extracting insights with the help of data science is a scientific process based on careful analysis of a data set.
Let's leave the business world and move to the more sensational world of elections. Guessing who will win an election is one of the most interesting things people do. Every year, before an election, statistical and research agencies start collecting and analyzing data to determine who will win before voting takes place. Politicians usually pay statistics organizations large sums of money to produce real-time reports on how the public perceives them. This way, a candidate can use that intelligence to steer his campaign.
Main Problem with Collecting Data
Most enterprises and institutions that collect data face a big problem when they try to predict elections: a lot of the information they gather is wrong. This leads to wrong conclusions in the end. The problem was at its most obvious during the 2016 presidential election in the US. That year, Donald Trump and Hillary Clinton were in the race, and most polls and reputable news sites said Hillary would win the election easily.
Most people believed this prediction because, on the one hand, there was Trump, who had no political experience and had campaigned by making very harsh and aggressive statements. On the other hand, Clinton had been involved in politics for more than 30 years and had served as Secretary of State. With all of this in mind, it didn't seem unreasonable to expect Clinton to beat Trump. Many reliable news sources, like the New York Times, put Clinton's chances of beating Trump at 85 to 15.
The Whole World is Impressed!
The whole world was amazed as the votes were being counted. Even the most optimistic statistical groups didn’t think Trump had a chance, but he won the electoral votes of key states one by one and quickly got the 270 electoral votes he needed to become president.
At the end of the count, Trump won with 306 electoral votes to Clinton's 232, one of the most surprising results in US electoral history. The failure shocked a lot of data scientists, and many papers have since been written examining different aspects of what went wrong.
One of the main reasons why many pre-election polls give misleading results is that people don't reveal who their favorite candidate is, for various reasons. They either say they won't vote or claim they will vote for someone else. However, on election day, they vote for the candidate whose name they refused to give. If enough people do this, the poll results will clearly diverge from the real outcome, and a statistical disaster like the one in 2016 will happen again.
A Summary of What the Problem is!
In this article, we are not going to dig into why the predictions for the 2016 election were so mistaken; in the future, we might publish a more in-depth article about the statistical reasons and factors behind it. Instead, we'll learn about a game based on math and statistics.
As was already said, one of the most important causes of prediction mistakes is the amount of wrong information that gets into the survey. If we can change how we collect data to make it much more accurate, our prediction of how the election will turn out will probably be right. We'll look at examples of this type of data tracking in the following sections.
Imaginary Forest
In a faraway forest where 200 animals live, there is an election to choose the forest leader. "Ms. Tree" and "Mr. Pig" are the two candidates in this election. "Ms. Tree" has been teaching forest residents for a long time. She thinks everyone in the forest and in neighboring forests should live peacefully and work together.
On the other hand, “Mr. Pig” is a violent person who loves fighting. He thinks that a lot of the resources should be put into the military and, if possible, used to attack neighboring forests and steal their resources. A secret group that works in the forest has given us the job of finding out who will win the election before it happens. Our lives are at risk if we are wrong about what will happen!
As a result, we decided to survey every forest member to predict the election's true winner. Based on the atmosphere in the forest, when we ask people who their favorite candidate is, the following will happen:
If a person wants to vote for “Ms. Tree,” he selects her as his preferred candidate.
If someone wants to vote for "Mr. Pig," they will probably give us the name of "Ms. Tree" instead. (We don't know how often this happens, but we do know it happens sometimes.)
As a result, the most important thing for us is to get accurate data from forest people. So, we should use a strategy that lets us find out people’s real favorite candidate without making them say it out loud. In other words, we need to find a middle ground between our language and theirs so that they can answer our questions in that middle-ground language and we can understand them. While we get accurate information from a person, his or her personal opinion is kept private and does not become public. But what should be done to solve this problem?
A Coin and an Endless Amount of Possibilities
As a consequence, it's enough to go to each of the forest's residents, give him a coin, and ask him to play the following game at home (without our presence) and tell us the outcome. The rules of the game are as follows:
Flip the coin where no one can see it.
If the coin lands on "heads," call out the name of the candidate you actually want to vote for.
If the coin lands on "tails," call out the name of "Mr. Pig," whatever your real preference.
As was already said, the biggest problem with tracking data about the election was that many people who were willing to vote for "Mr. Pig" didn't come out and say so, for various reasons.
This strategy solves the problem, since we cannot tell whether a person named "Mr. Pig" out of genuine preference or because the coin came up "tails." People can safely tell us the outcome of the game, since we never see whether their coin landed heads or tails. The data collection produced the following results:
Total number of votes: 200
Total Number of votes for “Mr. Pig”: 130
Total Number of votes for “Ms. Tree”: 70
In fact, since the coin is fair, we expect about half of the 200 flips, roughly 100, to have come up "tails."
The second rule of the game was that if the coin came up "tails," the player had to call out the name of "Mr. Pig." As a result, about 100 individuals announced his name without necessarily wanting to vote for him.
We must therefore deduct those 100 votes from the 130 total votes for "Mr. Pig," since they carried no real preference; those people declared the name "Mr. Pig" only to comply with the game's rules.
As a result of using this rule, we will have:
Total Number of votes for “Mr. Pig”: 30
Total Number of votes for “Ms. Tree”: 70
Due to the big difference in votes between "Ms. Tree" and "Mr. Pig," it is safe to predict that "Ms. Tree" will win on election day. We got the information we needed and predicted the election outcome using this approach without violating anyone's privacy.
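Here's the same bookkeeping as a short Python sketch. This is my own illustration, with the story's counts hard-coded:

```python
# The coin-game estimate, using the forest's numbers from the story.
total = 200       # respondents, one coin flip each
pig_said = 130    # answers naming "Mr. Pig"
tree_said = 70    # answers naming "Ms. Tree"

forced_pig = total // 2           # a fair coin forces ~half of all answers to "Mr. Pig"
honest = total - forced_pig       # the other ~half answered honestly
pig_true = pig_said - forced_pig  # genuine "Mr. Pig" supporters among honest answers

print(f"Honest answers: {honest}")
print(f"Mr. Pig: {pig_true}, Ms. Tree: {tree_said}")  # 30 vs. 70 -> Ms. Tree wins
```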
Some might question whether there is a case where someone wants to vote for "Mr. Pig," but because the coin came up "tails," his genuine vote is effectively discarded by our correction.
There are two answers to this question:
First, we are aiming to predict the winner of the election; it is not our objective to calculate exactly how many votes each candidate will get, and with this method we simply cannot do so precisely.
Second, the situation cuts both ways: some people who preferred "Ms. Tree" also flipped "tails" and had to announce "Mr. Pig," so their preference went unrecorded. These effects roughly cancel out, so they won't have much of an effect on the outcome of our forecast, and the prediction remains sound.
In the rest of this article, we’ll discuss how this approach can be used in product management.
Product Launches and Surveys
Your company just launched a new product. You are head of product management for this product, and after some time has passed, you create a survey with two answer choices to get more information and insight into user views.
To get more people to fill out the survey, the company offers $10 to each person who responds. But with this incentive, people might be more likely to choose options with positive meanings and avoid ones with negative meanings.
Without the ten dollars, though, people might not participate as much or read the questions carefully. So a plan is needed that keeps people motivated while making their answers more accurate. As the product manager, you decide to set up the following game.
As already said, each question has two answer choices: one with a positive connotation and the other with a negative connotation. You put a "+" sign next to options with a positive connotation and a "-" sign next to options with a negative connotation. At the start of the survey, you ask the participant to hold a coin and answer the questions this way:
Before reading each question, flip the coin.
If it lands on "heads," read the question and answer it honestly.
If it lands on "tails," pick the positive answer (+) without reading the question.
You have therefore removed the pressure to pick the positive option just to look agreeable, and you can now read the survey results more reliably. For example, one of the questions that two thousand participants responded to came out as follows:
Total participant count: 2000
1300 people voted for the choice with a positive connotation.
700 people voted for the choice with a negative and critical connotation.
Since the coin is fair, we expect about half of the 2,000 flips, i.e., 1,000, to have come up "tails." We therefore remove 1,000 from the 1,300 positive votes since, according to the rules, anyone whose coin showed "tails" chose the positive option without even reading the question. We now have:
300 people voted for the choice with a positive connotation.
700 people voted for the choice with a negative and critical connotation.
So, contrary to what the raw counts suggested at the beginning, most people actually had a negative perception of that question, and as product managers we need to find out why.
So, the financial incentive increased the number of people who took the survey, while this simple game made the answers much more accurate, meaning more realistic decisions can be made from the results in the future.
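To see why the correction works, here's a small simulation. This is my own sketch, and the 70% "true" negative rate is an assumption chosen to match the example's numbers:

```python
# Simulate the coin rule and check that it recovers the true split.
import random

random.seed(42)
n = 2000
true_negative_rate = 0.7   # assumed ground truth for this simulation

positive_said = 0
for _ in range(n):
    if random.random() < 0.5:
        positive_said += 1                  # tails: forced positive answer
    elif random.random() > true_negative_rate:
        positive_said += 1                  # heads: honest, genuinely positive

honest = n // 2                             # expected number of heads flips
est_negative = (n - positive_said) / honest # negatives only come from honest answers
print(f"Raw positives: {positive_said} of {n}")
print(f"Estimated negative rate: {est_negative:.2f} (true: {true_negative_rate})")
```

With these assumptions the raw count lands near 1,300 positives, yet the corrected estimate comes back close to the 70% negative rate we planted, which is exactly the subtraction the article performs by hand.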
Conclusion
According to the topics we discussed, one of the most significant factors to consider while running a questionnaire and collecting data is the accuracy of the data. If the data isn’t collected correctly, it’s clear that its conclusions will be misleading.
Most of the time, the following things cause survey questions to get the wrong answers:
The questions are about private things, and the person doesn’t feel safe answering them.
One of the alternatives doesn’t have a good reputation, which makes people not want to choose it because they don’t want to be judged.
Putting a person in a situation where he is forced to choose between only two options.
In general, it is the job of the people who create the survey and collect the data to detect such situations. To solve these problems, the following steps should be taken:
Creating a common language between the person asking the question and the person answering it, so that the respondent can confidently give the true answer.
Minimizing the area for judgment and providing a secure atmosphere in which the responder may choose the preferred alternative.
By taking the above steps, the quality of data tracking will greatly improve, making it more likely that conclusions and assessments based on this data are proper.
Follow me on LinkedIn and Twitter.
The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.
Cloud Computing: Experts Predict What’s Coming Next In The Business League
Today almost every organization has adopted cloud computing technology into its ecosystem to secure services simply and rapidly. The inevitable rise of the technology raises a question: how will on-demand IT evolve over the next ten years? Some say the cloud should be used as a platform for innovation; some say the focus should be on developing localized cloud services. Other experts have quite a different opinion: 'Consider how online will be the default setting for business operations' or 'Keep an eye on the developing capability of staff and providers.' Below are the views of 4 different industry experts on the rising planet of the cloud.
Gregor Petri
• Gregor Petri is Research Vice President at Gartner.
• He believes that CIOs who are looking forward to embracing the cloud should go beyond lifting and shifting existing applications.
• They should concentrate on disruption instead of thinking about the cloud as a space that runs present applications.
• Petri said, "Focus on a much more applied level of functionality. Look for areas where you can use the cloud as a platform to create unique functionality and special experience. Many of these experiences will be digital."
• Further, he added, "And to do that, you need a slew of supporting services, like voice, search and databases and many of those will be best-supported by the cloud, rather than traditional hardware. Only do what you want to do yourself as a business; consume the rest as a service."
• He foresees the future of the cloud as a platform for innovation.
• He asserted that CIOs will use on-demand IT resources as a platform to run emerging technologies, including AI/ML and quantum computing.
• Petri also said, "We'll be running lots of things in businesses we don't even have today. These are quite compute-intensive technologies and to get that resource on-premise is a big hurdle. These technologies will also be associated with bursts of activity, so not having to own hardware is attractive."
Alex von Schirmeister
• Alex von Schirmeister is Chief Digital, Technology and Innovation Officer at RS Components.
• Talking about his firm, he said, "The cloud gives my firm service flexibilities and cost efficiencies that were previously unavailable."
• Schirmeister believes that CIOs thinking of moving to the cloud will inevitably encounter non-IT executives who see embracing on-demand IT as a business risk.
• He says, "If a large cloud-based service goes down, it can wipe out the operational activities of entire companies or even industries. Compliance is also a concern for executives, especially when it comes to the General Data Protection Regulation and the geographical location of data."
• As governments try to legislate for the storage and use of data, the regulatory requirements around running cloud arrangements are expected to increase.
• In response to such continuous legislation, CIOs should consider building much more localized services.
• Alex said, "I do think there will increasingly be a notion where various companies start looking at private clouds or virtual clouds that are contained within certain boundaries and barriers. It may be a cloud but it may sit in a specific region or country, but we'll continue to see an evolution in the form of cloud technology."
Kevin Curran
• Kevin Curran is Professor of Cybersecurity at Ulster University and a senior IEEE member.
• He describes the cloud as 'a scalable foundation for the elastic use of infrastructure which has served a useful purpose in the initial move on demand.'
• Considering the changes in the business industry, CIOs should recognize a new definition of the cloud, which means preparing for a future where what goes online stays online.
• Kevin said, "Currently, most things are offline by default but being online and connected will become the default for everything. This points to a future where every device will simply connect to the cloud. 5G will support this – it might be fast, but its most impressive feature is its enormous capacity."
• He further said, "The cloud will be the foundation of devices that use data at the edge of the network and AI will benefit as a result. We will experience more natural interactions with computers; a superintelligence. This resource combined with fast 5G will serve us with a powerful form of computing that was previously in the realm of science fiction."
Alex Hilton
• Alex Hilton is Chief Executive of the not-for-profit industry body Cloud Industry Forum.
• Alex says the pace of disruptive innovation is so great that any attempt to predict what the cloud might look like in the coming decade is meaningless.
• CIOs' continuous march will maintain the rate of transformation in the industry.
• He said, "There is a distinct problem with the pace of change for most businesses. In truth, many organizations are not yet on the right trajectory. The constantly evolving technology landscape makes it very difficult for business leaders to move quickly, knowing which horse – in terms of cloud provider – to back."
• Hilton asserted that his organization has witnessed big change during the last 10 years as it tracked the shift of vendors to on-demand IT.
• He believes that CIOs should keep an eye on both the skills of their internal IT teams and external cloud providers.
• He said, "Successful companies will be those that embrace disruption – and that will be as true in the future for CIOs as it is for cloud providers."
• He further added, "Technology skills shortages around the cloud are evident and many of the success stories are from companies who deliver disruptive new ways of thinking or addressing a business need. The providers with foresight and the willingness to invest and be agile will be the winners in the future."
Moondust Could Chill Out Our Overheated Earth, Some Scientists Predict
In one possible future, great maglev lines cross the lunar surface. But these rails don’t carry trains. Instead, like space catapults, these machines accelerate cargo to supersonic speeds and fling it into the sky. The massive catapults have one task: throwing mounds of moondust off-world. Their mission is to halt climate change on Earth, 250,000 miles away.
All that dust will stream into deep space, where it will pass between Earth and the sun—and blot out some of the sun's rays, cooling off the planet. As far-fetched as it sounds, the idea has received real scientific attention. In a paper published in the journal PLOS Climate on February 8, researchers simulated just how it might go if we tried to pull it off. According to their computer modeling, a cascade of well-placed moondust could shave off a few percent of the sun's light.
It’s a spectacular idea, but it isn’t new. Filtering the sunlight that reaches Earth in the hope of cooling off the planet, blunting the blades making the thousand cuts of global warming, is an entire field called solar geoengineering. Designers have proposed similar spaceborne concepts: swarms of mirrors or giant shades, up to thousands of miles across, strategically placed to act as a parasol for our planet. Other researchers have suggested dust, which is appealing because, as a raw material, there’s no effort or expense to engineer it.
“We had read some accounts of previous attempts,” inspiring them to revisit the technique, says Scott Kenyon, an astrophysicist at the Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, and one of the study’s authors.
Kenyon and his colleagues don’t usually dream up ways to chill planets. They study a vastly different type of dust: the kind that coalesces around distant, newly forming stars. In the process, the astrophysicists realized that the dust had a shading effect, cooling whatever lay in its shadow.
“So we began to experiment with collections of dust that would shield Earth from sunlight,” says Kenyon. They turned methods that let them simulate distant dust disks to another problem, much closer to home.
Most solar engineering efforts focus on altering Earth’s atmosphere. We could, for instance, spray aerosols into the stratosphere to copy the cooling effects from volcanic eruptions. Altering the air is, predictably, a risky business; putting volcanic matter in the sky could have unwanted side effects such as eroding the ozone layer or seeding acid rain.
“If you could just reduce the amount of incoming sunlight reaching the Earth, that would be a cleaner intervention than adding material to the stratosphere,” says Peter Irvine, a solar geoengineer at University College London, who was not an author of the paper.
Even if you found a way that would leave the skies ship-shape, however, the field is contentious. By its very nature, a solar geoengineering project will impact the entire planet, no matter who controls it. Many observers also believe that promises of a future panacea remove the pressure to curb carbon emissions in the present.
It's for such reasons that some climate scientists oppose even researching solar geoengineering. In 2021, researchers scrubbed the trial of a solar geoengineering balloon over Sweden after activists and representatives of the Sámi people protested the flight, even though the equipment test wouldn't have conducted any atmospheric experiments.
But perhaps there’s a future where those obstacles have been cast aside. Perhaps the world hasn’t pushed down emissions quickly enough to avoid a worsening catastrophe; perhaps the world has then come together and decided that such a gigaproject is necessary. In that future, we’d need a lot of dust—about 10 billion kilograms, every year, close to 700 times the amount of mass that humans have ever launched into space, as of this writing.
That makes the moon attractive: With lower gravity, would-be space launchers require less energy to throw mass off the moon than off Earth. Hypothetical machines like mass drivers—those electromagnetic catapults—could do the job without rocket launches. According to the authors, a few square miles of solar panels would provide all the energy they need.
That moondust isn’t coming back to Earth, nor is it settling into lunar orbit. Instead, it’s streaming toward a Lagrange point, a place in space where two objects’ respective gravitational forces cancel each other out. In particular, this moondust is headed for the sun and Earth’s L1, located in the direction of the sun, about 900,000 miles away from us.
There, all that dust would be in a prime position to absorb sunlight on a path to Earth. The 10 billion kilograms would drop light levels by around 1.8 percent annually, the study estimates—not as dramatic as an eclipse, but equivalent to losing about 6 days’ worth of sunlight per year.
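That equivalence is simple to check with a back-of-the-envelope calculation (my own arithmetic, not taken from the paper):

```python
# A 1.8 percent cut in annual sunlight, expressed as whole days lost per year.
reduction = 0.018
days_per_year = 365
print(f"{reduction * days_per_year:.1f} days of sunlight per year")  # ~6.6 days
```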
Although L1’s gravitational balance would capture the dust, enough for it to remain for a few days, it would then drift away. We’d need to keep refilling the dust, as if it were a celestial water supply—part of why we’d need so much of it.
That dust wouldn’t come back to haunt Earth. But L1 hosts satellites like NASA’s SOHO and Wind, which observe the sun or the solar wind of particles streaming away from it. “The engineers placing dust at L1 would have to avoid any satellites to prevent damage,” says Kenyon.
Of course, this is one hypothetical, very distant future. Nobody can launch anything from the moon, let alone millions of tons of moondust, without building the infrastructure first. While market analysts are already tabulating the value of the lunar economy in two decades’ time, building enough mass drivers to perform impressive feats of lunar engineering probably isn’t in the cards.
“If we had a moonbase and were doing all sorts of cool things in space, then we could do this as well—but that’s something for the 22nd century,” says Irvine. Meanwhile, a far more immediate way to blunt climate change is to decarbonize the energy grid and cull fossil fuels, with haste. “Climate change,” Irvine says, “is a 21st century problem.”
The Best Freesync Monitors You Can Buy
Finding the right gaming monitor can be rough, and finding the right FreeSync monitor can be even more challenging. We want to make that search a bit easier on you, so we’ve looked around and found a few of the best FreeSync monitors you can buy.
Also read: The best monitors for work and play you can get
See also: FreeSync vs. G-Sync
Buying the best FreeSync monitor
FreeSync monitors are available at all kinds of prices, sizes, resolutions, and refresh rates. What's great is that FreeSync is a free feature, so it has seen wide implementation thanks to the lack of licensing charges. Even NVIDIA has a G-Sync tier called G-Sync Compatible, which supports FreeSync.
While picking the best FreeSync monitor, focus on your budget and use case. Gaming monitors make the best use of this technology, since the potential need for a solution to screen tearing is highest for them. You can then pick the resolution and refresh rate according to your needs and the specifications of your PC.
Also, make sure your GPU supports FreeSync before you buy a monitor. Compatible GPUs include all AMD Radeon graphics cards beginning with the Radeon RX 200 series (released in 2013) and all newer Radeon consumer graphics products using GCN 2.0 architecture or later, as well as NVIDIA GeForce 10 series cards and newer. You can check whether your GPU model supports it on the manufacturer's website.
Also see: What’s the best GPU for gaming?
The best FreeSync monitors
The Samsung Odyssey G7 is the most well-equipped monitor on the list with a 240Hz QLED display.
The MSI Optix Ultrawide strikes a balance between features, price, and resolution, adding up to a fantastic package.
The Gigabyte M32U is one of the best 4K gaming monitors you can get, and at 32 inches it could double as a media consumption display as well.
The Alienware AW2521HF has a lot of style, and substance to back that up. The 240Hz refresh rate and 1ms response time are some of the fastest out there.
The ASUS TUF VG27VQ has a curved panel for even more immersion in your games. It also features a 165Hz refresh rate for buttery smooth visuals.
The LG 27QN600 is great for budget-conscious buyers who still want a high-res experience. The display may only be 75Hz, but the QHD panel makes up for it.
The Sceptre C27B probably wasn’t on your radar, but it should be. Fantastic colors and a great refresh rate aren’t the only things on offer here.
The Dell S2421HGF is another bargain display, but this one has a 144Hz refresh rate for those willing to trade ultra-high resolutions for ultra-high refresh rate gaming.
The LG C1 OLED is an honorable mention for its stellar quality. This one will also be great for new gaming consoles.
Playing for the other team? Check out the best G-Sync monitors
The Odyssey G7 features G-Sync compatibility in addition to being a FreeSync monitor. We appreciate the duality here, Samsung. This is a QLED screen, by the way, so you will be getting some of the best visuals you can get on a gaming monitor.
Best ultrawide: MSI Optix Ultrawide
For those with an affinity for a wider FOV, MSI’s Optix Ultrawide is one of the best FreeSync monitors out there. This display measures 34 inches diagonally, features a 100Hz refresh rate and has a 3,440 x 1,440 resolution.
Read more: The best 144Hz monitors
Not only will the ultrawide screen be great for supported games, but it will also help when doing work or personal tasks by providing a lot of extra screen real estate.
Best 4K FreeSync monitor: Gigabyte M32U
For the highest of the high end, Gigabyte has a beastly display on its hands here. Gigabyte’s M32U is a 4K monitor with a 144Hz refresh rate and of course, FreeSync tech to eliminate screen tearing.
See more: The best 27-inch monitors
This is also a large monitor, coming in at almost 32 inches. This display will give you consistently stellar visuals with no compromise in sight. Conveniently, the Gigabyte M32U also has a built-in KVM switch for cross-computer accessory use.
Alienware AW2521HF
Alienware products have always had a specific look to them, and this one is no different. Rounded accents with a turquoise tint give this monitor a futuristic appearance. The AW2521HF FreeSync monitor features a high refresh rate at 240Hz and a 1ms response time for some of the smoothest gaming you can get.
Further reading: The best 240Hz monitors
The panel on this display is IPS and has 99% sRGB coverage for stunning visuals in games and accurate colors when you need them most. Like the Samsung model at the top of the list, this monitor has G-Sync compatibility as well.
Acer Nitro VG270
The most remarkable thing about this FreeSync monitor is its price. While it fluctuates quite a bit, it consistently remains an excellent budget offering. It provides a 75Hz refresh rate and a 1ms response time which is just enough to get you started.
ASUS TUF VG27VQ
The fact the TUF VG27VQ is curved sets it apart from many, but the 165Hz refresh rate helps it stand out even more. It’s faster than most in its price range, and it’s still a FreeSync monitor, meaning you’ll always get the smoothest possible visuals at any frame rate.
Further reading: The best cheap 144Hz monitors
Curved monitors can help with immersion in games by “wrapping around” your field of view, making it easier to see more of the screen without darting your eyes as much. Something cool to note: this monitor’s stand rotates so you can freely pivot your screen.
LG 27QN600
There's no short and friendly name for this monitor, but the 27-inch LG 27QN600 is a fantastic FreeSync monitor for the price. You'll get a QHD IPS screen, a 75Hz refresh rate, and excellent color accuracy with 99% sRGB coverage. It's also HDR10 compatible for amplified visuals where available.
Other than all that, it also has audio routing on the back so you can use multiple audio sources with the same speakers, or plug in your headphones.
Dell S2421HGF
Dell’s gaming monitors are often overshadowed by its Alienware brand, but the non-Alienware displays are nothing to sneeze at. The 24-inch S2421HGF FreeSync monitor has a 144Hz refresh rate, a 1ms response time, and a stand that offers height adjustment and tilt.
There’s not much that’s incredibly remarkable about this one, but it’s a bargain.
Can the RTX 4070 Do 4K 60fps?
Last Updated on April 13, 2023
Nvidia has just unveiled its brand-new graphics card to some immediate hype, given its marketed 1440p 60+ fps potential metrics. While most of the reviews prove that the card is a welcome addition to the firm’s growing fleet, all eyes are poised on its higher resolution standards. In essence, many wonder if the RTX 4070 is 4K 60fps capable and if the card is ultimately worth that $599 price tag.
RTX 4070 4K 60fps gaming
Officially announced on April 13, Nvidia's RTX 4070 is already a dream come true for some enthusiast PC builders, due in large part to that somewhat accessible price tag. At that price, however, many might be a bit underwhelmed by the RTX 4070's overall performance, as it is readily overshadowed by its 3080 counterpart, which debuted back in 2020.
Nvidia has stressed the card's incredibly high metrics for 1440p gaming, making a huge point of it in all of the RTX 4070's marketing. In 4K gaming, though, the 4070 drops off by a wide margin, showing that even with DLSS 3, the RTX 4070 isn't quite the generational leap many had been hoping for.
In his review of Nvidia’s newest hardware on The Verge, Tom Warren explicitly states that the RTX 4070 “misses in nearly every game” when it comes to 4K. Linus Tech Tips similarly states in his own review that “outside of Forza, the RTX 3080 ends up outrunning [the 4070] by anywhere from 5% to 19%” when gaming in 4K. That is, however, without ray tracing enabled, upon which as Linus himself states “the gap narrows significantly.”
Thus, while it is certainly a powerful new card with a multitude of high specs, the RTX 4070's 4K 60fps potential falls flat with ray tracing enabled. Users can, however, expect 60+ fps gaming with ray tracing turned off, depending on the specific title.
Will the RTX 4070 be better than the 3080?
As mentioned previously, the RTX 4070 is about on par with the 3080, if not a little behind in terms of 4K gaming. As Linus Tech Tips showed in his review, the 3080 outperformed its newest rival in most titles, and may well be the better option for those looking to experience 4K 60fps gaming. The 4070 is tailored for 1440p 60+ fps.
Can the RTX 4070 game at 4K?
Yes, the RTX 4070 can run games at 4K, but not optimally. Most titles, especially with ray tracing turned off, run over 60fps, but with it enabled they can't cross the 50-55fps range. That doesn't mean the card is bad for 4K experiences, just that it's better suited for 1440p.