I don't think it's that compelling to say "obviously no one wants to be on Instagram and they're getting manipulated into it." ...yeah they do! The question is can you make a compelling case that spending time on it is harmful.
I’ve been using the internet for longer than I care to admit, and I’ve never seen anything like it.
It was like 300 million junkies all lost their drug supplier at the same time.
That timeline has way more to do with the corrupt politicians than consumer behavior.
_______________
Both in the sense that the original semi-bipartisan law should've been ruled unconstitutional [0], and also in how the Republican party turned around and broke portions of that law for months until Trump could ensure the assets were handed to his major donor buddy--and fixing none of the original PRC influence issues. [1]
[0] https://www.aclu.org/news/national-security/banning-tiktok-i...
[1] https://www.techdirt.com/2025/12/19/tiktok-deal-done-and-its...
So it looks like politicians never had any problem with the addictiveness of social media; they only have a problem when it's used by foreign adversaries rather than by domestic companies...
Are you even remotely surprised by that? Honestly.
Mass misery is still misery.
I have to disagree with this. Having talked to heroin addicts in the past, I was told that the heroin addiction destroyed their relationships with their families and their friends, causing heartbreak in the process (particularly for mothers). They used everyone around them to get their next fix: the borrowing, constant cajoling and stealing alienated them from everyone in their social circle – other than fellow junkies.
When cut off from family and friends, junkies resort to begging, stealing, street prostitution, shop-lifting and other petty crimes, all of which have a negative effect on their community. Some junkies end up committing violent crimes which has a more destructive effect on society. They often end up in debt to their dealers and commit other crimes at their behest.
All these things are much worse when the junkie is a parent or has others depending on them for a safe and secure family life.
Also, in my country (Ireland), heroin junkies also place a huge burden on the health service. Their chaotic lives result in multiple health issues and they take up a significant portion of hospital beds.
How do you make sure that whoever makes that choice makes it in a way you yourself will agree with?
People's lives are ruined by gambling all the time, for instance. It is dumb to pretend like the pleasure a few people get out of it is worth someone betting away his family's welfare. It is ok to just decide "this needs to be regulated." Not everything is some intractable philosophical mystery that no consensus will ever coalesce around. Not every single thing every single person wants needs to be taken seriously.
If you're going to get philosophical, go all the way. Why have society at all, if it's just people imposing their will on others? Or do you at least agree that there exists a line?
It’s not at all obvious that “adults can’t have TikTok” is anywhere near the correct side of that line.
The TikTok ban successfully forced the sale of the US TikTok operations. I wouldn't be so dismissive of it.
No, it was not. It was actually nothing like that.
No babies were left to die because their parents were out searching for tiktok clips. I saw no people whoring themselves on the street just to see a few tiktok clips. I heard no stories of children stealing from their own family to get a few scrolls of tiktok. There were no people killing each other just to get a hit of tiktok.
Let's not trivialize something like drug addiction by comparing it to kids procrastinating by watching their TV phone app.
I'm just astonished how hard all of the supposedly rational engineering minds of hackernews are falling for this classic moral panic. The crowd of mindless pitchforks is cringe.
It must be cognitive gymnastics that makes people here feel more important. How powerful it must feel to believe your email job can addict and destroy the world...via...javascript scroll effects on...mobile entertainment apps.
I mean, how else do you rationalize the fact that you're paid as much as a heart surgeon to implement react components and reply thumbs up to messages on slack? All this doomsday cosplaying must help square the cognitive dissonance.
Instagram was supposedly the same, with Meta internally knowing that. They said it themselves, the teenagers couldn’t stop using Instagram even if they wanted to. I mean, isn’t that addiction?
I don’t need to feel important. I’m an addict trying to stay away from my triggers. It’s not Instagram, but I also know how that one feels, because I had an account for years. Of course I’m not saying it’s exactly like a drug — any drug —, but that to dismiss the very real, very negative design of these tools is also folly. They hijack the same brain chemistry, to similar results, and a different scale of recovery.
No, developers aren’t special. Nobody in tech is. But Instagram themselves, in their own document, are basically admitting to behaving like a very capable dealer of a neural drug.
The median child of a social media user (so, basically, the median child) is vastly better off than the median child of a heroin/crack cocaine user, and it's not even close.
The fact you're suggesting some level of equivalency is wild imo.
Glad I could draw attention to the irrational logic of the current "social media is evil" moral panic.
I want to follow news and deals from a handful of vendors and local businesses I like a lot. The best way to do that is following them on instagram. It’s the only reason I signed up and installed the app. If it’d been one or two, I’d not have bothered, but it’s that way for lots of them.
I never want to see the “feed”. I would disable it if I could. I would make it default to my “following” view if I could. Instagram so very much wants me not to do that that they went out of their way to make it impossible to achieve that even with iOS’ built in shortcut-like system (you used to be able to).
As a result, sometimes I get distracted by one or two of the top items on the feed. That doesn’t mean I actually want to see them. That I open the app once every couple days doesn’t mean I like the app. I think it’s terrible.
People taking what folks do with a sharply constrained set of options as an expression of “what they want” or revealed preference or whatever is frustratingly wrong.
I can't say I know anyone who defends extended social media usage. Do you?
I sort of claimed that everyone enjoys it when they use these apps, maybe it's better to say they are likely getting something out of it in that moment. This could be kind of a bad deal - people make bad deals, and repeat old ones all the time. Other times they delete the app once they realize it.
I took a trip to Yosemite last weekend and took the (rare) opportunity to post a reel. All of the comments and reactions are DMs. It feels so lonely and weird and isolating. Who asked for this?
I miss the days where you shared things, and people actually commented on them and interacted with each other as well as the poster. And where it wasn't ephemeral.
By that I mean- is the product addiction, with a shroud of media, or is it media which just happens to be addictive.
The entire revenue model is based on engagement and clicks; the product is incentivized to maximize time spent on the service at any cost. Addiction is a core engineering requirement.
Facebook has in the past run emotional-manipulation tests on its users without informing them.
They're rotten from the head down.
It's the former, by design:
Now if only the dick heads running this complete rag could listen to the wonderful people who wrote that enlightened piece and let users unsubscribe: https://www.reddit.com/r/assholedesign/comments/rli0u9/how_t...
I've written more about this here: https://klemenvodopivec.substack.com/p/recommender-systems-n...
The most important evidence was just internal research saying exactly what the plaintiffs wanted.
Google Chrome is trying hard to become a mandated technology, but hasn't quite succeeded yet.
The author made a choice to publish there. They want the paywall, because that's how they get paid for this writing.
Huh? Does anyone actually care any more? The kind of moralizing busybodies that spend their time shaming the tobacco industry are few and far between.
RJ Reynolds does not have their pick of the most elite graduates. Most of them would be ashamed to tell their friends and family, no matter the salary.
Facebook is not that.
https://journals.sagepub.com/doi/10.1177/26318318221116042
snippet from the abstract
> Contrary to the earlier notion that addiction is predominantly a substance dependency, research now suggests that any source or experience capable of stimulating an individual has addictive potential. This has led to a paradigm shift in the psychiatric understanding of behavioural addictions.
dopamine, the little “hit” you get on social media sites or when you get a “ping”, has a massive role to play in behavioural addictions. and with behavioural addiction it basically causes the same stuff in the brain that cocaine etc does (very simplified explanation).
also, i’m a recovering drug addict. and i can tell you for sure from my lived experience that addiction is definitely not limited to physical stuff like drugs. xD
Addiction isn't just [chemical in blood stream] -> [addiction]. Addiction involves many steps, many of them in the brain, and many of those reactive to non-physical events.
gonna need a citation on that one, dawg
Plus, smoking doesn't kill people; its pathological outcomes do. Similarly, looking at a phone screen might hurt a user's eyes, but it won't kill them; however, the decisions that user makes over time due to the effects of the subject matter they interact with might well put them at risk. And if aspects of that subject matter are deliberately amplified for their addictive properties, should platforms be regulated to control this?
No JavaScript, no CAPTCHA, no DDoS^1, no geo-blocking, no other nonsense^2
echo '
url https://www.economist.com/by-invitation/2026/04/29/stop-big-tech-from-making-users-behave-in-ways-they-dont-want-to
user-agent "Mozilla/5.0 (Linux; Android 14) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/127.0.6533.103 Mobile Safari/537.36 Liskov"
header accept:
output 1.htm
'|curl -K/dev/stdin
firefox ./1.htm
1. https://gyrovague.com/2026/02/01/archive-today-is-directing-...
2. What's up with the LinkedIn reCAPTCHA sitekey in the page source
How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?
I honestly can't tell if this is serious or satire, so apologies if missed the joke.
Pushing a git repo to a new server is built into git itself.
Github project data is easy to export: https://docs.github.com/en/issues/planning-and-tracking-with...
There are import tools for many competing projects that will transfer it over in various ways.
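To make the first point concrete, a full repo migration needs nothing beyond git itself; both remote URLs below are placeholders, not real servers:

```shell
# Moving a repo off one host onto another needs only git.
# --mirror copies every branch, tag, and ref, not just the default branch.
git clone --mirror https://old-host.example/user/project.git
cd project.git

# Repoint origin at the new server, then push everything in one go.
git remote set-url origin git@new-host.example:user/project.git
git push --mirror
```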
Only the project owner can do that.
I don't know how you'd write it in a law either, but if you're in a meeting at your tech company, and the product owner or tech lead uses language like "We need to get users to do..." and "We need to incentivize..." and "It should be easy to do X and hard to do Y...", then do whatever is in your power to steer or stop it. You're not really building a product users want; you're pushing a behavior-modification scheme onto users.
> you're pushing a behavior-modification scheme onto users
In general I think that your comment is reasonable. I just would like to point out that such "behavior-modification" schemes are sometimes introduced for genuinely good and ethical reasons.
For instance, it is in my opinion desirable to make it more difficult for users to delete all their photos by e.g. having to confirm their decision in a dialog first. Because it prevents them from accidentally doing something they might not want to do and which is potentially impossible to revert.
One famous case was Apple suing Samsung over patents. Hard to prove until internal comms surfaced showing intent to copy the iPhone.
For laws like this it always boils down to "I'll know it when I see it" which is such a shockingly poor way to write legislation that I'm flabbergasted it doesn't immediately fail any amount of rudimentary scrutiny. Not to mention the latitude it grants for selective enforcement. It's basically Washington asking (through the Economist) for a leash on platforms that host their critics that they can yank at any time the population gets too rowdy, with the convenient justification that the algorithm is too good and our attention spans are in danger or whatever.
If it were so easy, we'd do this all the time. We already do it a lot, and there are heaps of examples where it goes wrong.
Infinite scroll is one obvious one. As well as forcing algorithmic feeds of accounts we don't follow.
That would be a lot of extra work for the platforms, but I think the results would be interesting. It amounts to legislating that certain features have to be optional and configurable.
Edit: I know what network effects are, I was talking about steps individual users can (and should IMO) take. We should be helping our friends, family and neighbors find safe and health alternatives like Signal for comms. Build different networks that are actually social and not doomscrolling.
It still amazes me how engineers on HN are in awe of AI and LLMs, knowing that 90% of us will be affected (we won't be able to bring money to the table) once the higher-ups normalize using AI even more to reduce headcount. Not everything is about the technical details, people; grow up.
It's deeply sad to see how our most beloved work (those side projects we pour ourselves into purely for the joy of it) will, in the end, be the very reason most of us lose our jobs (not all of us, but the majority). OpenAI/Anthropic/etc. simply took all of that and turned it to their advantage. It's capitalism, sure, but it's heartbreaking... I wouldn't mind being out of a job for another reason, but not for that one, please.
I'm not blind; I have a Claude Pro (not Max) subscription and a Cursor subscription. But I'm really hesitant to go balls-to-the-wall on the most powerful models, because it isn't sustainable; I don't want it to be. So how much can I get from the older, smaller, cheaper models that will hopefully be commoditized? I think the harness improvements are making headway. I continue to think Cursor Composer 2 is more than adequate.
Then again if one believes it's a race to the singularity, then that's another story. I don't.
LLMs are objectively smarter than any one person, so by some definition we've already created super-intelligence. The problem is they just sit there. They have all the answers already, if you think about it. Whenever we ask something, they give us the answer; it's amazing, and we can even say they synthesize new information. We can agree with all the claims.
But what does it do with that super-intelligence? Nothing. It can't. it doesn't have will. Or interest. Curiosity? Biological imperative. Who knows.
So we create loops and introspection and set them free. Does giving AI a goal make the AI conscious? That seems plainly silly if you ask me.
(I'm trying really hard not to make this philosophy. I really like the philosophy aspect, but this is my 30 second answer to the question)
I am no philosopher but https://poc.bcachefs.org/ seems conscious.
It's no more conscious than running that cron job to send you today's weather. That's as far as I understand what this link is. The agent is posting blog updates and such. Because it was told to. It has no will. LLM generative output is incredible. It's also not conscious.
I'm a mid programmer at best, like compared to top guys in the industry, who built stuff like OpenClaw or those prodigy 16 year-old coders who became millionaires, and yet I don't fear the LLM assisted coding future. I'm at peace knowing that I will adapt to the LLM programming world using my knowledge in my favor, or adapt to a world where I will no longer be a SW engineer, but something else.
Also, I find it ironic and poetic how some SW devs here want us to rise up and fight LLMs and the companies making them for disrupting this profession, when the SW dev profession was so well paid precisely because the SW products they wrote disrupted other people's professions, moving the savings from labor costs into the pockets of employers, who used SW to optimize repetitive processes rather than hire as many people. Those devs never saw an issue with other people losing their jobs. "Learn to code", eh?
Oh how the turntables.
Then why hasn't anyone else done it before?
With hindsight, it's always easy to say anyone could have done it too, but there's more to product success than just coding and shipping an app out the door.
The first iPhone was built using COTS(commercial off the shelf) parts that Nokia, Ericsson and Motorola also had access to, and SW tools they also had access to, yet Apple won and buried the other companies because their end-product was way more popular with the customer base. I'm sure engineers from Nokia, Ericsson and Motorola also said "we could have done exactly the same thing with the right leadership" when they saw that.
I also say "I could have done that" when I see how the maker of Flappy Bird became a multi millionaire, or how any other top 100 AppStore slop app has 100+ million downloads.
Coding skills are a dime a dozen these days. A lot of people can do 95% of these things now. The differentiator between failure and success comes from the remaining 5%: network effects, market know-how, promotion, timing, outreach, UI, UX, luck, etc.
There are some things I could easily say I (and many others) could not build even in retrospect. Solidworks, for example is beyond a lot of people’s skill level and very difficult to build.
Flappy bird and open claw, not so much.
After years of near monopoly status these companies have a lock on many people's social lives. To give up Instagram is akin to giving up text messaging. "Just stop using it" isn't helpful advice to those people.
If Instagram disappeared tomorrow it would be different, because everyone would be in the same position. But preaching personal responsibility in an area subject to network effects doesn't work.
Now, would it be inconvenient to stop? Sure. But people need better self-control. Put that cookie down!
That's a straw man argument. I never said they were.
> There are even studies that show that it makes their users depressed.
What percentage of the population do you think are in the habit of reading academic studies about the effects of the products they use?
It all feels reminiscent of cigarette smoking. The damage was very well known yet people continued to do it. It took extensive government regulation to wean people off their addiction, not a "buck up, chump" motivational message.
What works for you, and me actually, doesn't work for most people; humans are complex things.
Would you place all the responsibility of drug addiction on drug dealers?
Yes, their practices are predatory, but it is essential to remind the addicts that ultimately change comes from within themselves. They need to change something.
That's asking every company to prove a negative before rolling out new features.
Could we have a regulatory agency that keeps an eye on dark patterns and deals with them as evidence emerges that something is harmful?
That’s not as ridiculous as it seems. It’s sort of the model that drug manufacturers follow. It would also mean that if they see troubling behaviour internally, they know they have to stop.
Practically, it would be corporate cover up. And applied earnestly it would make these businesses unviable.
Internal testing showed these features were addictive. They had resources allocated to creating addictive experiences for tweens.
The underlying behavioral science is well studied, down to the causal level.
Dark patterns are designed to make it hard to exit and unsubscribe. The language is purposefully obtuse, the options buried behind menu choices. We have enough A/B testing data to know how effective friction is at dissuading people from following a path.
How are we proving a negative here?
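The A/B-testing point above is easy to make concrete: given completion counts from a no-friction and a friction variant (all numbers below are invented for illustration), a standard two-proportion z-test quantifies how decisively the friction suppresses a behavior:

```python
# Sketch: how strongly does an extra friction step suppress cancellations?
# A two-proportion z-test on hypothetical A/B completion counts.
from math import sqrt, erf

def z_test(success_a, n_a, success_b, n_b):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)       # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Variant A: one-click cancel. Variant B: cancel buried behind extra steps.
z, p = z_test(800, 1000, 450, 1000)  # completion counts are illustrative
print(f"z={z:.2f}, p={p:.4f}")
```

A |z| this large means the friction effect is unambiguous in the data, which is exactly why platforms can tune it deliberately.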
> The question is whether the company can show, before rolling a product out to billions of people, that it is not predatory by design.
"Not predatory" is a negative
Found this document:
https://www.economist.com/by-invitation/2026/04/29/stop-big-...
Headlines (quote):
Instagram is an inevitable and unavoidable component of teens lives. Teens can’t switch off from Instagram even if they want to.
Instagram has become the ID card of this generation. It is the go-to tool for both measuring and gathering social prestige.
Instagram sets the standards not only for how teens should look and act but also for how they should think and feel.
Teens feel themselves to be at the forefront of new social behaviours to which there is no consensus on how to behave or cope. They sorely lack empathetic voices to whom they can turn for support.
Teens talk of Instagram in terms of an ‘addicts narrative’ spending too much time indulging in a compulsive behaviour that they know is negative but feel powerless to resist.
The pressure to ‘be present and perfect’ is a defining characteristic of the anxiety teens face around Instagram. This restricts both their ability to be emotionally honest and also to create space for themselves to switch off.
Anxiety around what to post and the potential cost involved in posting the wrong thing means teens are switching from proactive to passive engagement with the platform.
Insert credit card and two forms of id to log on...
We're going to get better and better at hacking the human brain - for good and evil and we're going to have to trade some free will and personal liberty to really keep the worst of it in check. The dark pattern bullshit is the easiest thing to regulate but I don't have a lot of hope for even that.
Firms can optimize as they like, but if the net result is that the market ceases to function, then those behaviors get penalized.
This leads me to think about the idea of procrastination as a mechanism of gambling by the sub-conscious. A subversive way of "raising the stakes on the game" in an attempt to "make things a little bit more interesting."
(Not only in terms of tech, but also in terms of ways of living popularized by celebrities, thought leaders, etc.)
For example, infinite scroll is a product of a news feed and a news feed is algorithmic. What this produces and what it reinforces in the user is one thing but not really related to some small grey text in an Amazon Prime sign up.
So let's break it down. Some of the issues are:
1. Intent to sign up.
2. Difficulty in cancelling a service. This is what I call the "gym model": easy to sign up, hard to cancel. This can be handled. California, for example, requires companies to offer online cancellation. Most other states don't. This is so much of an issue that you'll regularly find advice to change your address to California so you get that option. There's no reason why every state or the federal government couldn't do the same.
3. Selling of your data. Not really touched here but it's going to be a big issue going forward;
4. Addictive behavior to maximize time spent on platform; and
5. What should we allow or disallow for minors. This is going to be a big issue. We're only at the start of the Age Verification Era (like it or not). But IMHO no company should be talking about how to maximize time spent for 13 year olds. And no advertiser should be able to advertise to minors; and
6. Not really touched here but I'm going to add it anyway. IMHO we give tech companies a free pass for algorithms as some kind of mystical, neutral black box. But everything an "algorithm" does represents a decision humans made to get a certain behavior from what training data is used, what they're optimizing for (eg interactions or time spent) and what features they create.
Platforms now essentially get liability protection from publishing content even though they elevate or suppress content based on what it contains. IMHO this is no different than someone deciding what to publish and being liable for it.
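Point 6 can be made concrete with a toy sketch: "the algorithm" is just human-chosen weights over human-chosen features, and the choice of weights is the editorial decision. All field names and numbers below are invented:

```python
# Toy feed ranker: the "neutral black box" is really a human-picked objective.
def score(post, w_time=0.7, w_interact=0.3):
    # Someone chose these weights; favoring watch time over interaction
    # IS the editorial decision, even though no human ranks each post.
    return (w_time * post["expected_watch_seconds"]
            + w_interact * post["expected_likes"])

feed = [
    {"id": 1, "expected_watch_seconds": 40, "expected_likes": 2},
    {"id": 2, "expected_watch_seconds": 5, "expected_likes": 30},
]
ranked = sorted(feed, key=score, reverse=True)
print([p["id"] for p in ranked])  # which post "wins" depends on the weights
```

Swap the default weights (e.g. `w_time=0.1, w_interact=0.9`) and the ranking flips, which is the sense in which the output reflects human decisions rather than neutrality.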
There is no market if you have no mechanism for price discovery, no meaningful alternatives, and users who are addicted, confused, and simply unable to switch.
Things WERE better before we combined Skinner boxes with ad tech, before we preyed on users and applied every trick in the book to entrap them.
“We respect your privacy” banner, with a big green ok button and a “manage data collection” tiny print text that had consent for everything automatically approved
Before they existed websites would just put stuff on your computer without asking. They’re literally a consumer protection.
Direct your outrage elsewhere.
I think you're being condescending though, and missing the point.
Just like people who will complain about a news site with ads or some other unrelated design feature of the site they don’t like.
Again, if you’re on here you presumably know how to block ads, or cookie dialogs.
> But but but <argument I am mocking>
> Shhh! <People I don't agree with> will hear you!
> It's almost as if <sarcastic oversimplification>.
> Tell me you <don't understand topic> without telling me you <don't understand topic>.
Social media is not making you behave in ways you don't want. On the contrary, it's giving you EXACTLY what you want. People want to doomscroll social media instead of engage reality, because the real world requires action, effort and social risk...doomscrolling is pure passive consumption.
If we're going to give people autonomy and freedom to choose how they spend their time, at some point we have to draw the line and hold people accountable for their own actions. Or we have to acknowledge we'd rather stay in a permanent state of adolescence and give full control of our lives to big brother.
This constant push by the urban monoculture to turn everything into an "addiction" and turn everyone into a "victim" is a terrible set of ideas to put in people's heads, and is equally as toxic as anything they claim smartphone apps are trivially doing with UI design.
Apps are not physically addictive like cigarettes or alcohol and never have been.
And if you're going to argue social media preys on reward systems in the brain, that's true of everything humans do. Reward systems in the brain govern every single action we take, so anything we do can be turned into a victimization by some addictive outside force.
If I'm making bicycle wheels, I want all my customers thinking "these are the best bicycle wheels. I don't want them from any other supplier ever again and actually I want some that I don't even need just in case." I want them up at night thinking about how great my bicycle wheels are, looking at pictures of them on their phones.
I'm not sure how people are squaring the circle where companies are supposed to meet market demand by giving people what they want, but, uh, "not like that." If a product people want is really that bad for them, vote for the government to regulate it. We've read this story before.
The market is the thing we create, and its effective functioning is what competition and regulation is meant to enable. It is through the functioning of the market that effective resource allocation occurs.
> A market economy is meant to generate the best allocation of resources and the biggest benefits for consumers. For these promises to be fulfilled, consumers must be able to see and choose alternatives deliberately; compare them on undistorted dimensions; form preferences that reflect actual interests; and switch freely. Cognitive exploitation undermines all four of these. Infinite scroll captures attention. Dark patterns distort comparison. Dopaminergic loops manufacture compulsion. Addiction engineering blocks effective switching.
> Securities regulation offers an instructive analogy. When a trader manipulates stock or derivatives prices, the law treats the crime as a structural harm to the broader market; the corrupted price no longer tells the truth. Cognitive exploitation should be seen in the same light, at a much larger scale. When platforms systematically manufacture the preferences of billions of users, consumer signals no longer point anywhere useful. That is a structural failure.
I can say with certainty that opioids are addictive. I can also say with certainty that doomscrolling is pretty far on the opposite end of that spectrum. I have yet to meet someone who would steal copper pipes off of an abandoned building or sell their body on the street for a few scrolls of tiktok.
But why do you get out of bed at all in the morning? What drives you to exist...are those reward systems in the brain addictive? Why are you sitting at your keyboard right now arguing with a random stranger on the internet?
Are you procrastinating something else you should be doing instead...and is that Hackernews' fault or yours?
You'd like the goalposts to sit closer so it's easier to offload responsibility onto abstract external entities.
I'm arguing this doesn't change who has to be the one to close the app, shut off the TV, turn off the video game, close the bag of candy and take risks in the real world.
I occasionally play a perpetually-in-alpha AAA+ game (I won't name it to avoid the flames) that recently asked users to fill out a questionnaire. At no point did it ask how they could make my time spent in the game more fun or awesome. They did explicitly ask, "What can we do to make you spend more time in game?". The focus was clearly on quantity, not quality. This made me realize that, perhaps, I should stop playing this game.
Social media and games use all sorts of dark patterns and engagement bait to keep you clicking, but no concern is given to giving back. There is a complete absence of awareness that the best forms of entertainment enrich and then end. If they were to provide an amazing but brief experience that changes regularly, people would come back again and again. They don't need to spend hours on it every single day to feel they're getting value and justify opening their wallets. Doom-scrolling and spending excessive time grinding in games will only make you feel stressed out and unfulfilled. Customers need to realize this and start voting with their wallets for experiences that end.
We need to turn things around and say, "The light that burns half as long burns twice as bright!"
Make their data junk.
Unfortunately, games keep getting longer and longer with more and more filler. The problem is that many gamers complain loudly when games are short. There are comparatively few games that buck the trend. Now, I play very few games as a result.
But you're right: comparatively few games give you "more game" vs "more filler".
For example, I liked Factorio because I was always struggling with the "current paradigm" and trying to get to the "next paradigm": coal-powered furnaces vs electric furnaces, or conveyor belts vs trains.
But some games just make themselves longer by just turning the knobs on grinding.
I guess this is like novels vs short stories. And we are seeing the same sorts of things where the story arc is stretched over a long series of books and content is fluffed up a bit.
I miss those days where the measure of success was having someone play the game at all and enjoy it, not how long they might be locked into your product to the detriment of anything else in life.
One tactic against Fortnite has been to say "Yes, you can play on the console this evening, but not Fortnite and instead choose from one of these games." That at least encouraged him to play through Horizon Zero Dawn, as an example.
I think that’s where League gets you as well. New champions. New items. Oops, just redid the skill tree. Oh hey, balance changes. All while you’re trying to go up in rank. It feels like work — and has in fact become a job for some —, but can be incredibly fun and addictive.
I’m glad I’m out of it, and instead get to play my good ol’ Steam collection.
I'm a noob, the depth of the game is still unfathomable to me.
Every time I sit down to play, the game feels richer, more nuanced. The minds I encounter striving for the same joy of mastery come alive and reveal themselves - you'd be amazed how much personality can come through mouse clicks.
I take better care of myself because if I'm in a bad mood or mindset, I won't be able to play good dota. It's a litmus test for my current mental and emotional state.
Oh, and it keeps my ego in check: If you're playing for yourself, you will lose. It's better to work together on a suboptimal plan than announce you're correct and sabotage the team. Humility complements skill.
I find most other games boring - I don't play for story, I play for depth and nuance of mechanics, for connoisseurship and mastery. Most video games are on-rails; may as well watch netflix for all it demands of you. Having a wide variety of colored candies does not make a diet.
You get out of games what you put in, in my opinion.
Fair, although what you get out of League is a lot of stress and abusive teammates. Of course you also get tons of fun and challenging matches that make it worth it. But…
I have about 8k hours on Factorio, and about half that in Terraria. Another 5k in Civilization V. Now: are they more enjoyable and enriching than a MOBA? Do they have better mechanics and provide more happy memories? Yes on all counts.
Of course we all enjoy different things :) And hey, in DotA you have a really cool sniper!