Psychology
Most babies and small children are now using mobile devices
Mobile devices have been changing the world for the past decade or so, and not always for the better. I don’t know about you, but when I was a child nobody had a personal phone (let alone a smartphone, which had yet to be invented); such devices were usually shared by the whole family. Skip ahead a couple of decades and today most babies and very young children are being introduced to mobile technology, often before they can even walk or speak properly. The consequences might not be immediately obvious, but it’s easy to imagine the current generation growing up heavily dependent on technology.
Needless to say, babies don’t need mobile devices, so why show them such things at all? There is no clear answer to that question at the moment, but what is clear is that a new study recently presented at the Pediatric Academic Societies annual meeting in San Diego reveals some worrying figures. According to Hilda Kabali and her colleagues, most children aged 2 and younger are already spending a fair amount of time in front of a smartphone or tablet screen. The numbers vary with age, but the behavior shows up in about 1 in 7 babies under the age of 6 months, rising to 1 in 3 by the age of just 1 year.
Perhaps it’s not entirely accurate to call two-year-old children “babies”, but these statistics should raise a red flag regardless. It goes without saying that parents shouldn’t encourage reliance on technology from such a young age, and yet they do. The American Academy of Pediatrics advises parents against letting young children play with smartphones or tablets, and with good reason if you ask me. Given that babies are unlikely to seek out mobile devices on their own if they don’t know about them, this shouldn’t be a problem for responsible parents. On the other hand, parents who ignore what pediatricians have to say should know that there’s a very high chance their children will develop a degree of dependence on mobile devices as they grow older.
Psychology
People’s morals don’t change with age; they change with the seasons
A new study shows that when people in wealthy English-speaking countries are asked which moral values matter most to them, their answers change with the time of year. The difference isn’t huge, or people would have noticed it before, but it is consistent and has been surprisingly stable across many years. It may even affect people’s decisions when it comes time to vote.
Drawing on the work of More’s contemporary Whittington, Robert Bolt called Sir Thomas More “A Man for All Seasons.” Bolt was talking about how More stayed true to his values even as the political climate changed around him. The implication was that this didn’t happen very often, but “seasons” was used as a metaphor, and no one thought that it might have something to do with the actual seasons of the year.
But researchers from the University of British Columbia checked and found that for many of us, the season does change what we think.
For decades, social scientists have asked people how important values like loyalty and kindness are to them in order to figure out how they act.
For his study, UBC doctoral student Ian Hohm used data from yourmorals.org to look at whether people’s views on these values change with the seasons. He discovered that they do, but in some strange ways.
Purity, loyalty, and respect for authority are values central to traditional conservatism. They tend to go together and are collectively called “binding values,” so it is no surprise that they rise and fall together over the course of the year.
What is strange is that this trio is valued most not in the summer or winter but in the spring and fall, and is seen as less important in the summer and winter. The effect was much stronger than needed to pass statistical significance tests, and it stayed strong even after accounting for the fact that older and wealthier people were more likely to respond in the spring and summer.
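The phrase “statistical significance tests” can be made concrete with a toy calculation. The sketch below is a hypothetical illustration with made-up numbers, not the study’s data or its actual method: it runs Welch’s two-sample t-test on synthetic binding-value scores grouped by season.

```python
import math
import random
from statistics import mean, variance

random.seed(42)

# Synthetic binding-value scores on a 0-5 scale (invented numbers):
# spring/fall respondents score slightly higher on average.
spring_fall = [random.gauss(3.4, 0.8) for _ in range(500)]
summer_winter = [random.gauss(3.2, 0.8) for _ in range(500)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

t = welch_t(spring_fall, summer_winter)
# With samples this large, |t| above roughly 1.96 clears the usual
# p < 0.05 significance threshold.
print(f"t = {t:.2f}")
```

Here the seasonal gap is baked into the synthetic data; the point is only what “passing a significance test” means, namely that the observed gap is far larger than sampling noise alone would produce.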
Fairness and compassion, the values liberals and progressives most often emphasize, changed less over the course of the year, and where they did, the pattern was less clearly tied to the seasons. In relative terms, then, they carried more weight in the summer and winter.
“People’s support for moral values that help groups stick together and follow the rules is stronger in the spring and fall than it is in the summer and winter,” Hohm said in a statement. “Because morals are such an important part of how people decide what to do and how to judge others, we believe that this discovery could be just the beginning of many more findings to come.”
Historically, this means the fact that US elections fall at the beginning of November may have given conservatives a small boost. No one can say whether American history would look very different if elections were held in the summer or the winter. Nor can the study say whether this advantage has faded now that some self-described “conservatives” have shown open disrespect for science.
In countries where the prime minister or president picks the election date, the results might interest leaders even more. When the authors looked at other English-speaking countries, such as Australia and Canada, they saw the same pattern. In the UK, however, support for binding values dropped sharply in the summer and peaked in the winter. In all of these cases, though, the sample size was less than a tenth of what is available for the US, so those results should be treated with more caution.
It makes sense that values would swing more with the seasons in places with a big difference between summer and winter than in places where the seasons are mild. Indeed, there was much more variation in Canada than in Australia. However, the authors didn’t break down the US variation by state to see whether, say, Alaskan values vary more than Floridian ones.
For that explanation to work, though, winter and summer should be opposites rather than close matches. Still, the mood of the seasons did give the authors a big clue. A follow-up study using the same methods found that Americans are more stressed in the spring and fall. “This link suggests that people who are more anxious may look for comfort in the group norms and traditions that are upheld by binding values,” said senior author Professor Mark Schaller, who recently published a paper on how the seasons affect other parts of our minds.
The data were gathered every week for ten years and showed the same pattern throughout, so they probably weren’t skewed much by events tied to any particular season. Then again, anniversaries like September 11 could be a long-term factor unrelated to the weather, and the authors suspect Christmas may play some role as well.
They also note that these trends might affect more than elections. A criminal conviction marks a breach of respect for authority, so it is worth investigating whether judges hand down harsher sentences at certain times of the year. The authors use the COVID-19 pandemic as an example of how morals shape the way we handle crises: knowing that loyalty and respect for authority run higher at some times than others could change how public health campaigns are targeted.
The study is written up in the journal Proceedings of the National Academy of Sciences.
Psychology
WHO Recognizes Gaming Disorder as a Mental Health Condition
There are several issues surrounding gamers’ mental health when it comes to excess and other risky behavior. The long-standing question of “Can gamers become addicted to video games?” has now been answered by the World Health Organization (WHO).
The WHO is adding gaming disorder to its International Classification of Diseases in 2018. According to New Scientist, the WHO will officially recognize obsessive gaming as a mental health condition.
I know this might seem like the World Health Organisation just aims to push an “all gamers are meanies” agenda. However, that is far from the case, as representatives from the organization have made clear that there is a real difference between a gaming addict and a gamer.
According to a current draft, the criteria include making gaming a priority “to the extent that gaming takes precedence over other life interests”, and continuing to do so despite the risk to your health – such as lack of sleep and sustenance. However, this behavior must be observed for at least a year before a diagnosis can be confirmed.
In other words, if you play games like Super Mario Odyssey or Cuphead for a few hours and take breaks to drink water or move around, then you simply don’t have an addiction. However, not even I can deny that there are some sick individuals out there who have gone to awful lengths to satisfy their gaming cravings.
“Health professionals need to recognize that gaming disorder may have serious health consequences,” Vladimir Poznyak at the WHO’s Department of Mental Health and Substance Abuse told New Scientist.
Now, there are some downsides to this decision, chiefly the stigma people will make of it. There’s a fear that ordinary gamers will be mistakenly labeled addicts just because they play games for more than 20 minutes. Considering the world we live in already loves to lump gamers in with alt-right terrorists, this isn’t an unfounded worry.
Then there’s the problem that while the WHO has been calling out gamers and their activities, nobody in the organization wants to admit there is a similar problem with smartphone users and apps. This is a major concern as well, because people have also done outlandish things for games like FarmVille, or for their smartphones themselves.
Multiple factors play a role in whether or not to call a gamer an addict, so we shouldn’t panic too much about this new classification from the WHO.
Apps
Study: App Notifications Worsen the Mood of the User
Do you find phone notifications annoying? I certainly do, mostly because they interrupt my music. And when you have multiple apps, all it takes is a bit of data connection to ruin your day. Now a study corroborates that smartphone alerts end up worsening the user’s mood.
Researchers at Nottingham Trent University in the UK studied the effect on mood of thousands of digital alerts received by 50 participants over a five-week period. Out of more than half a million notifications, they found that 32 per cent resulted in negative emotions.
What causes such a negative impact? The context behind the alerts is usually non-human activity – general phone updates and Wi-Fi availability, for example. The research group also found that work-related notifications affect people’s mood negatively, and the problem only worsens when notifications arrive in bulk.
“These digital alerts continuously disrupt our activities through instant calls for attention,” Eiman Kanjo, a researcher at Nottingham Trent University, told The Telegraph. “While notifications enhance the convenience of our life, we need to better understand the impact their obsessive use has on our well-being.”
So how was the study conducted? The research group created an app called NotiMind, which the volunteer participants downloaded. Over a five-week period, the app collected details about the phones’ digital notifications, along with the participants’ self-reported moods at various points in the day.
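As a rough illustration of what such an analysis might look like – this is a hypothetical sketch with invented data and category names, not the NotiMind code – one could pair each logged notification with the nearest self-reported mood and compute the share that coincided with a negative mood:

```python
from datetime import datetime

# Hypothetical notification log: (timestamp, category) pairs.
notifications = [
    (datetime(2017, 9, 1, 9, 0), "work_email"),
    (datetime(2017, 9, 1, 9, 5), "wifi_available"),
    (datetime(2017, 9, 1, 12, 30), "friend_message"),
    (datetime(2017, 9, 1, 18, 0), "system_update"),
]

# Hypothetical self-reported moods at various points in the day.
mood_reports = [
    (datetime(2017, 9, 1, 9, 10), "negative"),
    (datetime(2017, 9, 1, 12, 35), "positive"),
    (datetime(2017, 9, 1, 18, 15), "negative"),
]

def nearest_mood(ts, reports):
    """Return the mood report closest in time to a notification."""
    return min(reports, key=lambda r: abs(r[0] - ts))[1]

def share_negative(notifs, reports):
    """Fraction of notifications whose nearest mood report was negative."""
    moods = [nearest_mood(ts, reports) for ts, _ in notifs]
    return moods.count("negative") / len(moods)

print(f"{share_negative(notifications, mood_reports):.0%} of notifications "
      "coincided with a negative mood")
```

With the toy data above, three of the four notifications fall nearest a negative mood report, so the script reports a 75% share. The real study would of course involve far more data and a more careful attribution of mood to individual alerts.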
Not everything is doom and gloom, though: there were some positive results when it came to notifications from friends, especially when participants received several messages at once. These notifications created a sense of belonging and a feeling of connection to a social group.
So that’s what the report says. People get annoyed when notifications interrupt the important moments in life. I, for one, hate being reminded that I didn’t turn my Wi-Fi off by a notification telling me there’s a network nearby. But hey, maybe someday we’ll be able to filter out these alerts so we can focus on the things that matter.