Comment by rvnx

17 hours ago

Very nice to see that this project is hand-crafted and not AI-generated like 99% of the submissions here

So, congrats on your release.

When I clicked, I was already thinking about the comments that ask "is this vibe coded?", so I kind of asked myself that question too. As someone who codes by hand and also experiments with AI-assisted coding, I ask myself what attitude we should develop towards AI-assisted coding in the long run. Right now on HN it almost seems like "AI shaming" is at work: if you post a project that's the result of using AI, you can expect a lot of critique around here. While I understand that to a certain extent, I guess we also need to overcome that sentiment. After all we don't blame people using IDEs, code completion or other tools that have become the norm.

  • It would be similar to me posting “Show HN: I built a turbo encabulator in Rust” when I had actually hired coders from Craigslist to bring my idea to life.

    • To modify the popular phrase: "If you didn't bother to code it, why would I bother reviewing it?"

      Providing feedback to an author is only valuable if the author at least knows why they did something, so you can discuss how it's good or how it's not.

  • > After all we don't blame people using IDEs, code completion or other tools that have become the norm.

    Because those don’t have the same issues. It’s not like IDEs, LSPs, and other tools were the target of warranted criticism and then we stopped. Rather, they never received this kind of backlash in the first place.

    No IDE has ever caused millions of people absolutely unrelated to it to have to ration water.

    https://archive.ph/20250731222011/https://m.economictimes.co...

    To use an exaggerated analogy, it’s like saying “people are complaining about arsenic being added to food but we need to overcome that sentiment, after all we don’t blame people adding salt and pepper which have become the norm”.

    • If that's the reason why people dunk on AI-assisted programming, fine.

      That's not the impression I had, though; the criticism I usually see is around laziness, low-effort submissions, etc., which are not inherent issues of using LLMs.

    • TL;DR: That article is pretty low quality, and "caused millions of people absolutely unrelated to it to have to ration water" doesn't seem like a reasonable conclusion; it isn't mentioned at all in the source article. I took some notes on this article and traced the research back to the original piece by The Austin Chronicle, which is significantly better: https://www.austinchronicle.com/news/2025-07-25/texas-is-sti... Would recommend.

      Main takeaways:

      - Why are we building data centres so close to the equator, where it's hot?

      - It's depressing to see the high-quality reporting from The Austin Chronicle watered down into more and more clickbaity sound bites as it gets recycled through other "news" orgs. But at the same time, I wouldn't have heard about it otherwise.

      - The water evaporation was interesting to me; I'd love to read more on what percentage actually evaporates, whether Stargate's plans to build non-evaporative cooling will actually hold up, and how that will impact the water grid.

      - Would love some more info/context on that 463 million number, but I'm stopping my research here for now. Combining this with when and how often Texas has to ration its water would provide a stronger argument for or against the claim of water rationing.

      - The fact that we don't have good numbers for how much water data centres are using is crazy; we need that level of granularity/regulation.

      - Markers of poor reporting:

        - Numbers without context/clarity. Would it kill these sites to include a bar chart?

        - Citations of sites that market themselves as entertainment

        - Ambiguous or contradictory data

        - Ambiguous references

      Notes:

      Interesting article! A few weird things:

      1. The most cited reference is to a site called "Techie + Gamers", which describes itself as "TechieGamers.com is a leading destination for engaging entertainment coverage, news, net worths and TV shows with a strong focus on Shark Tank." That makes me suspicious of the journalistic quality of both this article and that one.

      2. In the headline it says "Texas AI centers guzzle 463 million gallons". Further down it says "According to a July 2025 investigation by The Austin Chronicle, data centers across Central Texas, including Microsoft and US Army Corps facilities in San Antonio, used a combined 463 million gallons of water in 2023 and 2024 alone, as reported by Techie + Gamers." Over 2023 and 2024? It's odd that it gives the sum over two years, and I'm not sure what it means that it includes the US Army Corps. Also, without any context I don't know what this number means.

      - I checked the TechieGamers article and this contradicts what is written there, which says the 463 million number is for San Antonio alone.

      3. Robert Mace, executive director of The Meadows Center for Water, notes that "once water evaporates, it's gone." This is interesting; I'm not sure how much water actually evaporates vs. is returned to the grid.

      4. "The scale of water use is massive, as the Texas Water Development Board projections estimate that data centers in the state will consume 49 billion gallons of water in 2025, soaring to nearly 400 billion gallons by 2030, as per Techie + Gamers report. That’s about 7% of Texas’s total projected water use, according to the report." - Mixed citations here, not sure whether these numbers are from Texas Water Development Board or Techie + Gamers. Also they project an increase from ~232 million gallons/year in 2024 to 49 billion in 2025? That's a 200x increase. And they expect a further ~8x increase from 2025 to 2030 to 400 billion? Or is it because the original number was only for Central Texas?

      - 7% of what? The 2025 number or the 2030 number?

      - Again, subtle contradictions with TechieGamers, which says "a white paper submitted to the Texas Water Development Board projected that data centers in the state will consume 49 billion gallons of water in 2025. That number is expected to rise to 399 billion gallons by 2030, nearly 7% of the state’s total projected water use." So it's not the Texas Water Development Board but a white paper submitted to the board? I'm not sure who produced these numbers now.
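
      A quick back-of-the-envelope check of those ratios (a sketch only; it assumes the 463 million gallons split roughly evenly across 2023 and 2024, which none of the articles actually state):

          # Sanity check of the jumps implied by the quoted figures.
          # Assumption (mine): 463 million gallons over 2023-2024 means
          # roughly 232 million gallons/year.
          MILLION, BILLION = 1e6, 1e9

          central_tx_per_year = 463 * MILLION / 2   # ~232 million gal/year
          statewide_2025 = 49 * BILLION             # white-paper projection, 2025
          statewide_2030 = 399 * BILLION            # white-paper projection, 2030

          print(f"2024 -> 2025: ~{statewide_2025 / central_tx_per_year:.0f}x")  # ~212x
          print(f"2025 -> 2030: ~{statewide_2030 / statewide_2025:.1f}x")       # ~8.1x

      That matches the ~200x and ~8x jumps mentioned above, so the real question is whether a Central Texas (or San Antonio-only) number is being compared against statewide projections.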

      5. "Much of the water these centers use evaporates during cooling and can’t be recycled, a critical issue in an area already grappling with scarce water resources, as reported by Techie + Gamers."

      - Again, I'd really want more info/numbers on this.

      The root article seems to be from "The Austin Chronicle":

      1. This starts with "After Donald Trump and Elon Musk’s public breakup, Sam Altman replaced Musk as the president’s new favorite tech guy. Altman, the CEO of OpenAI, has become something like Musk’s archnemesis on the rapidly developing stage of artificial intelligence in Texas." This doesn't match my reading of the news, and it's so colourful that it makes me question the journalistic quality of this article.

      2. The reporting across the three sources is mixed on who they're blaming. The Economic Times doesn't even mention OpenAI and calls it "Microsoft's Stargate campus". TechieGamers uses this phrase, but also later says "Microsoft has partnered with OpenAI". And The Austin Chronicle doesn't mention Microsoft at all and focuses on OpenAI. Meanwhile, the Wikipedia page for Stargate calls it a "joint venture created by OpenAI, SoftBank, Oracle, and investment firm MGX"?

      3. I take it back; reading further, this article is _significantly_ better than the others, with many more reputable sources.

      4. Finally we get some real sources!! The 49 billion (2025) and 400 billion (2030) numbers are from HARC, the Houston Advanced Research Center. And the 7% is actually 6.6%, relative to the 2030 projection.

      5. Finally, real info on evaporation!! Still no numbers, but we get a description of the process:

      > Most data centers use an evaporative cooling system, in which the servers’ heat is absorbed by water. The heat is then removed from the water through evaporation, causing the water to be lost as vapor in the air. The cooler water then goes back through the machines, and this loop is regularly topped off with fresh water. After all, evaporation renders the water saltier and unusable after four or five cycles. “Then they dump the water, and it goes down the sewer,” Mace said.

      > ...

      > The Abilene Stargate campus will reportedly use a closed-loop, non-evaporative liquid cooling system that requires an initial refill of around 1 million gallons of water, with “minor” maintenance refills. Cook is skeptical that such closed-loop systems will use as little water as they suggest. It’s not possible, Cook says, to use the same water over and over again, recycled infinitely, to cool servers.
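
      To get a feel for what "four or five cycles" implies, here is a rough steady-state water balance for an evaporative loop like the one described above (standard cooling-tower accounting: makeup = evaporation + blowdown, cycles of concentration = makeup / blowdown). The function name and the specific numbers are my own illustrative assumptions, not figures from the article:

          # Rough steady-state water balance for an evaporative cooling loop.
          # Assumption (mine): ~5 cycles of concentration, i.e. water is dumped
          # "after four or five cycles"; drift losses are ignored.

          def water_balance(evaporation_gal: float, cycles: float = 5.0):
              """Return (makeup, blowdown) gallons for a given evaporative loss."""
              blowdown = evaporation_gal / (cycles - 1)  # dumped down the sewer
              makeup = evaporation_gal + blowdown        # fresh water drawn in
              return makeup, blowdown

          makeup, blowdown = water_balance(1_000_000)    # per 1M gallons evaporated
          print(f"makeup {makeup:,.0f} gal, blowdown {blowdown:,.0f} gal")
          # -> makeup 1,250,000 gal, blowdown 250,000 gal: under these assumed
          #    numbers, ~80% of the intake is lost to the air as vapor.

      Under that (assumed) ratio, most of the intake never returns to the grid, which lines up with Mace's "once water evaporates, it's gone" point above.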

      6. This article doesn't mention the 463 million figure anywhere, which makes me think that was original research from TechieGamers. They reference SAWS, the San Antonio Water System, but again the numbers are without context, so I'd need to do some original research to get any meaningful insights from them.

  • If I can tell something is "vibe coded", that means it's bad. It doesn't matter what tools people use as long as the output is good. Vibe coding smells include:

    1. Tons of pointless comments outlining trivial low-level behaviour,

    2. No understanding of abstraction levels,

    3. No real architecture at all,

    4. Not DRY: no helper functions, or inconsistent use of them across the project,

    5. Way too many lines of code.

    None of these shame the use of any particular tool; they just shame the output.
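
    For what it's worth, here's a made-up snippet (hypothetical names, not from any real submission) showing smells 1 and 4 side by side with the same logic cleaned up:

        # "Vibe coded" style: comments restating trivial behaviour (smell 1)
        # and the same parsing copy-pasted instead of shared (smell 4).
        def load_users(path):
            # open the file
            with open(path) as f:
                # read all the lines
                lines = f.readlines()
            users = []
            # loop over every line
            for line in lines:
                # split the line on commas
                parts = line.strip().split(",")
                # append a dict to the list
                users.append({"name": parts[0], "email": parts[1]})
            return users

        def load_admins(path):
            # open the file
            with open(path) as f:
                lines = f.readlines()
            admins = []
            for line in lines:
                parts = line.strip().split(",")
                admins.append({"name": parts[0], "email": parts[1]})
            return admins

        # The same thing with the noise removed and the logic shared:
        def load_records(path):
            """Parse 'name,email' lines into dicts."""
            with open(path) as f:
                return [dict(zip(("name", "email"), line.strip().split(",")))
                        for line in f if line.strip()]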

    • OK, maybe we'd better not talk about "vibe coding" at all, because we don't really have a definition of what it means. "Historically" it means "just letting the AI code without looking at its output", while I often see people who use AI more diligently use the term somewhat tongue in cheek. My mistake for using the expression in the latter sense.

  • >Right now on HN it almost seems like "AI shaming" at work.

    HN leans "old school". It's the Angry Nerd trope; Comic Book Guy from the Simpsons.

    The people doing "AI shaming" or claiming that "AI doesn't work" are going to have their lunch eaten.

    • > "AI doesn't work" are going to have their lunch eaten.

      Please, stop it. AI doesn't work for every use case, and when it "works" it fails to live up to the exaggerated hype. People with deep knowledge and experience will eat their lunch. An LLM's knowledge is stale as soon as a new version comes out, and no, RAG hallucinates too. It's a tool, and a tool doesn't have an appetite.

Thanks! Although I did have to use it for some things (the logo, for example; I’m not a "graphics guy"), in the end, since it’s a simple project by design, I didn’t mind, and the result isn’t bad at all.

It's really odd that we now look for human-written code rather than AI-generated code, and I think this is going to increase across every form of data out there.

Genuinely why do you care?

  • This may not be entirely the right metaphor, but I kind of see it as the difference between fast food, a top-rated restaurant, and home-made cooking, with fast food being AI.

    Generic, does the job, not the highest quality: bleak, fast, repetitious output.

    • While I agree (because I like writing code), I do wonder if this is how assembly programmers felt when automated compilation started to take off.

    • >not the highest quality

      I bet AI writes better code than 80% of developers out there.

      But all developers think they are in the 20%.