
Science

69% of gamers say they “smurf,” even though they hate it


A recent study on toxicity in gaming finds that a sizable majority of gamers, 69 percent to be precise, openly admit to smurfing, even though they hate it when other players use smurfing tactics against them.

For those unfamiliar with the term, smurfing may seem like a perplexing concept. Some may even imagine a scenario where 69 percent of gamers paint themselves blue and only communicate using the word “smurf” during their gaming sessions. If that’s your assumption, then you couldn’t be more mistaken.

In online multiplayer games, developers strive to create a balanced and enjoyable experience by matching players with opponents of similar skill, so that nobody is constantly overwhelmed by opponents who far exceed their abilities. Some players, however, have found ways around these matchmaking systems: they create new accounts, or borrow existing ones from other gamers, in order to compete against opponents with significantly lower skill levels.

In 1996, two Warcraft 2 players gained such a fearsome reputation for their exceptional skill that fellow gamers would immediately quit matches upon spotting their usernames. So that they could keep playing the game they had bought, the pair created additional accounts named PapaSmurf and Smurfette and proceeded to dominate their adversaries under these fresh profiles. The term “smurfing” caught on and is now commonly used for players who intentionally create new accounts to compete against less skilled opponents.

Smurfing is widespread: 97 percent of participants in the recent study said they encounter smurfs during their gameplay. The gaming community regards the behavior as harmful, yet 69 percent of respondents admitted to smurfing at least occasionally, and 13 percent admitted to doing it frequently or almost always.

The study, conducted by a team from Ohio State University, found that participants perceived smurfs, compared with smurfees, as more likely to be toxic, more likely to disengage from the game, and more likely to enjoy it. There were also significant self-other effects: compared with themselves, participants believed that other gamers were more prone to toxic behavior, less inclined to keep playing the game, and less likely to enjoy it.

After the initial study, the researchers asked gamers (recruited from Reddit) about their motivations and found a range of reasons for smurfing, from wanting to play with friends of different skill levels to the satisfaction of beating inexperienced players. In a follow-up study, players were asked to assess different justifications for smurfing, which they were told were actual reasons given by smurfs who had won the game they were smurfing in, and to say how much punishment the smurf deserved.

The team anticipated that individuals would adopt a “motivated-blame perspective,” wherein they would universally consider smurfing to be morally objectionable, regardless of any justifications.

This perspective, lead author Charles Monge explained in a press release, holds that if an action is deemed wrong, the justification behind it is irrelevant: it remains wrong. “The concept is that it should be irrelevant whether you were simply playing casually to join your friends; you caused me to lose this game, and now I am angry.”

Nevertheless, the team discovered that gamers assessed the morality of smurfing on a personal level, categorizing certain forms of smurfing as more culpable than others. They also expressed a desire for stricter penalties for smurfs who had less valid reasons for engaging in smurfing, such as a desire to dominate less skilled players.

A third study found that people who don’t play video games show a similarly nuanced view, perceiving the same subtleties in smurfing behavior. While the result is intriguing in its own right, given how much toxicity is associated with gaming, the team hopes to apply the findings in other areas.

“Games can serve as a powerful tool for testing concepts that extend beyond the realm of gaming,” Monge stated. Studying how blame is assigned in an online setting can provide insights into how blame is assigned in general.

The research is published in the journal New Media & Society.


Astronomy

Mars and Jupiter make their closest approach in the sky this week, and they won’t come this close again until 2033


Mars and Jupiter will be only 0.3 degrees apart in the sky on August 14, a very close pass from our point of view. If you miss it, you won’t see another approach this close until 2033.

When two objects appear to pass each other in the sky from our point of view, it is called a conjunction. If the planets all moved in one perfectly flat plane, the closer planet would block out the other every time two of them came together. In reality, each planet’s orbit is tilted slightly relative to the others, so they pass a little to the north or south of one another, and the gap is a different size every time.

When a conjunction is especially close, the result can be especially stunning. In 2020, Jupiter and Saturn came close enough to be seen in the same field of view through a telescope, a real treat for people who like to observe the sky.

The full Moon is about 0.5 degrees wide, so this pair, at 0.3 degrees apart, will fit in any telescope view that can hold the whole Moon. The pairing will also look good on the mornings just before and after the closest approach.

Even with the naked eye, though, a close conjunction can make the sky look more striking, and the contrast between the red of Mars and the white of Jupiter will be particularly eye-catching. Mars’ brightness varies a lot, however: at its brightest it is about as bright as Jupiter, but right now it is roughly 16 times fainter. Even so, both planets are bright enough that, clouds permitting, you should be able to see them from all but the most light-polluted cities.
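For readers who track planetary brightness in astronomical magnitudes, here is a rough back-of-the-envelope conversion of our own (not a figure from the article): a factor-of-16 difference in brightness corresponds to about three magnitudes under the standard Pogson relation,

\[
\Delta m = 2.5 \,\log_{10}\!\left(\frac{F_{\text{Jupiter}}}{F_{\text{Mars}}}\right) = 2.5 \,\log_{10}(16) \approx 3.0,
\]

which is roughly the gap between Jupiter at around magnitude −2 and Mars at around +1 this month.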

Most people will miss this sight, though, because the pair is not visible in the evening from anywhere on Earth. Exactly when the planets rise depends on where you live, but it is usually between midnight and 3 am. To catch them, you will generally need to be up before astronomical twilight begins, so the pair has time to climb clear of the thickest part of the atmosphere.

For people in Europe, Africa, west Asia, and the Americas, the moment of closest approach falls at 14:53 UTC, during daylight. The mornings before and after, though, will look almost as close.

Mars and Jupiter meet about every two and a half years, but the most recent conjunction was almost twice as wide and could only be seen in the morning. The next one will be more than a degree apart, and the one after that, in 2029, will be just under two degrees.

A close conjunction in the sky does not mean the two planets are physically close to each other, though. Mars takes 687 days to orbit the Sun and is currently less than 100 days past perihelion, so it is closer to the Sun than usual, and therefore farther from Jupiter. Jupiter is a little closer to the Sun than average, but not by much. For the two planets to be at their minimum possible separation, Mars would need to be at its farthest point from the Sun while Jupiter was at its closest, so in terms of actual distance this encounter is nothing special.

But it will still be a beautiful sight, and one you will have to wait more than nine years to see again.


Medicine and Health

A recently identified strain of deadly fungus poses a significant risk to public health


Researchers have discovered a new clade of Candida auris, a potentially dangerous pathogen. The finding brings the total number of identified clades of the fungus, an emerging multidrug-resistant superbug, to six.

Candida auris is a species of yeast that can cause serious illness and is frequently resistant to antifungal drugs. Healthy people generally do not fall ill, but the fungus spreads readily within healthcare facilities and poses a significant risk to people with compromised immune systems. It can cause anything from superficial skin infections to severe, life-threatening illness such as bloodstream infections, and because it is so often resistant to multiple drugs, it can be difficult, and sometimes impossible, to treat.

The authors state that the pathogen is a significant global public health threat due to its widespread distribution, resistance to multiple drugs, high ability to spread, tendency to cause outbreaks, and high mortality rate. Although infections are still relatively uncommon, there has been a significant increase in cases in recent years.

Previously, the fungus had been categorized into five distinct clades, each located in different geographic regions: South Asia, East Asia, Africa, South America, and Iran.

The most recent clade was identified in April 2023, when doctors at Singapore General Hospital found a patient carrying an unusual strain of C. auris during a routine screening program. Such cases typically involve people who have recently traveled, but this patient had not left the country in two years, which raised concerns.

Genetic analysis showed that the strain did not match any of the five known clades, meaning it belongs to a previously unidentified sixth clade. The researchers then tested strains from earlier patients and found two more isolates of this clade in Singapore, as well as another in Bangladesh.

The extent of the new clade’s prevalence and its potential to cause invasive infections and outbreaks remains uncertain at present. However, the researchers emphasize the importance of promptly identifying and controlling it in order to safeguard patient well-being.

“The ramifications of this breakthrough transcend the confines of the laboratory.” “Given the recent discovery of the sixth Candida auris clade, it is imperative to enhance surveillance capability or create new methods to strengthen existing surveillance strategies. This will enable health care facilities to closely monitor its emergence and effectively control its spread,” stated Dr. Karrie Ko, co-first author of the study.

Fortunately, the cases described in the study remained susceptible to all of the antifungals tested, which should ease any worries about a pandemic like the one depicted in The Last Of Us. The threat from C. auris is persistent, however, so further efforts are needed to identify new strains, monitor their spread, and limit their clinical consequences.

The research is published in The Lancet Microbe journal.


Psychology

People’s morals don’t change with age; they change with the seasons


A new study shows that when people in wealthy English-speaking countries are asked which moral values matter most to them, their answers shift with the time of year. The difference isn’t huge, or people would have noticed it before, but it is substantial, and it has been surprisingly consistent over many years. It may even affect how people vote.

Drawing on the work of More’s contemporary Whittington, Robert Bolt called Sir Thomas More “A Man for All Seasons.” Bolt was talking about how More stayed true to his values even as the political climate changed around him. The implication was that this didn’t happen very often, but “seasons” was used as a metaphor, and no one thought that it might have something to do with the actual seasons of the year.

But researchers from the University of British Columbia checked, and found that for many of us the season really does change what we value.

For decades, social scientists have asked people how important values like loyalty and kindness are to them in order to figure out how they act.

For his study, UBC doctoral student Ian Hohm used data from yourmorals.org to look at whether people’s views on these values change with the seasons. He discovered that they do, but in some strange ways.

Purity, loyalty, and respect for authority are values central to traditional conservatism. They tend to go together and are known as “binding values,” so it makes sense that they rise and fall together over the course of the year.

Strangely, this trio is valued most not in summer or winter but in spring and fall, and is seen as less important in summer and winter. The effect was far stronger than needed to pass statistical significance tests, and it held up even after accounting for the fact that older and wealthier people were more likely to respond in the spring and summer.

Fairness and compassion, values that liberals and progressives frequently emphasize, changed less over the course of the year, and when they did, the pattern was less clearly tied to the seasons. In relative terms, then, they carried more weight in the summer and winter.

Hohm said in a statement, “People’s support for moral values that help groups stick together and follow the rules is stronger in the spring and fall than it is in the summer and winter.” “Because morals are such an important part of how people decide what to do and how to judge others, we believe that this discovery could only be the beginning of many more effects that will happen in the future.”

Historically, this means that the timing of US elections at the beginning of November may have given conservatives a small boost. No one knows whether American history would look very different if elections were held in summer or winter. Nor can this data reveal whether that advantage has been lost now that some self-described “conservatives” have shown open disrespect for science.

In countries where the prime minister or president picks the election date, the results might be of even more interest. When the authors looked at other English-speaking countries, such as Australia and Canada, they saw the same pattern. In the UK, however, support for binding values dropped sharply in the summer and peaked in the winter. In all of these cases, though, the sample size was less than a tenth of what is available for the US, so those results should be treated with more caution.

It would make sense for values to shift more with the seasons in places with a big difference between summer and winter than in places where the seasons are mild, and indeed there was much more variation in Canada than in Australia. However, the authors didn’t break down the US data by state to see whether, say, Alaskan values vary more than Floridian ones.

The pattern would make more intuitive sense if winter and summer were opposites rather than close matches. The mood of the seasons did give the authors a big clue, however: a follow-up study using the same methods found that Americans are more stressed in the spring and fall. “This link suggests that people who are more anxious may look for comfort in the group norms and traditions that are upheld by binding values,” said senior author Professor Mark Schaller, who recently published a paper on how the seasons affect other parts of our minds.

The data was collected week by week over ten years and showed the same pattern each year, so it probably wasn’t skewed much by events tied to one particular season. Then again, anniversaries like September 11 could be a long-term factor that has nothing to do with the weather, and the authors think Christmas might have some effect as well.

They also say these trends might affect more than elections. Crimes show a lack of respect for authority, so it is worth investigating whether judges hand down harsher punishments at times of year when binding values run stronger. The authors also point to the COVID-19 pandemic as an example of how morals shape the way we handle crises: knowing that loyalty and respect for authority are valued more highly at some times than others could change how public health campaigns are targeted.

The study is published in the journal Proceedings of the National Academy of Sciences.

