Hacker News | buster's comments

As far as I know, SQLite has such tests, and probably other projects do too.

Performance tests aren't unusual. But sometimes things get slower out of necessity, and it's impossible for a test to automatically distinguish intentional from unintentional slowdowns. At some point someone has to make a judgment call and update the test to accept the new state of things. Or you draw a hard line and say things are never allowed to get slower, no matter what, but that can be a tough goal to maintain.
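The judgment call described above usually lives in a number someone has to edit by hand. A minimal sketch (the budget, workload, and names here are all hypothetical, not from any particular project):

```python
import time

# Hypothetical regression check. The threshold encodes a human judgment
# call: when a slowdown is intentional, someone must consciously raise
# BUDGET_SECONDS rather than the test silently accepting it.
BUDGET_SECONDS = 0.5  # assumed budget for this workload

def workload():
    # Stand-in for the operation under test.
    return sum(i * i for i in range(100_000))

def test_within_budget():
    start = time.perf_counter()
    workload()
    elapsed = time.perf_counter() - start
    assert elapsed < BUDGET_SECONDS, (
        f"workload took {elapsed:.3f}s; if this slowdown is intentional, "
        "update BUDGET_SECONDS deliberately"
    )

test_within_budget()
```

In practice the budget would come from recorded baselines rather than a constant, but the shape is the same: the test fails until a person decides the new speed is acceptable.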

But that's not what the whole context contains. The whole context contains a lot of noise and false "thoughts". What the AI needs to do is document the software project efficiently, without duplication. That's not what this tool is doing. I question the value of storing all that crap in git.


Maybe the point is that one engineer replaces 10 engineers by using the dark factory, which by definition doesn't need humans.


The great hope of CEOs everywhere.


And then he gets replaced by a new hire when he asks for a raise.


Domain driven design?


I assume it's this one: https://www.gnu.org/software/ddd/

I used it back in uni, in '98, and it really helped me understand debuggers. After it, even using gdb made sense.


I was so confused. Why would domain-driven design be especially good for debugging? I guess context is bound within the models... And then all the other comments were just talking about debugging tools. Glad I was not the only one.


That or the Kagi Browser... Waiting for a Linux release.


More Webkit :/


Just curious, what don't you like about Webkit?


It's the IE of the 21st century.

Browser monoculture is bad for the open web and if all we have is Webkit (Safari on iOS, Macs) and its fork Blink for all the Chromium browsers, then the web will start becoming a mess of proprietary extensions instead of open standards.


>It's the IE of the 21st century

I see this claim often. As someone who learned web dev during the days of IE dominance, I don't understand it.

Internet Explorer never kept up, especially once IE6 reigned supreme. It wasn't "a little behind," with a few niche APIs missing or implemented in a buggy or proprietary way. It actively ignored standards, it didn't receive real updates for a long time (IE11 being the best they could ultimately offer), and with few exceptions (namely the early grid layout work that led to CSS Grid, and XMLHttpRequest) it degraded the ecosystem for over a decade. It actively held companies back from adopting new web standards. It's why polyfilling became as widespread as it is now.

Safari / WebKit has not induced any of this. Yes, sometimes Safari lags behind in ways that are frustrating. Yes, sometimes Apple refuses to implement an entire API for political rather than technical reasons (see the FileSystem API), but largely it has managed to stay up to date with standards in a reasonable time frame.

While their missing or partially implemented APIs can feel really frustrating, they haven't actively held back work or the mass adoption of newer browser APIs.

Apple has its faults, but this isn't even close to the drudgery of the IE heyday.


Sounds amazing, but I would rather be able to run the database locally and use the same thing in dev as in production. Is this possible?


PlanetScale's Postgres offering is as close to plain-old-Postgres as we could possibly build.


It's just a hosted Postgres database, you could run this locally with Docker for example: https://hub.docker.com/_/postgres
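For local dev, the official image only requires a password to be set; this is a sketch of a throwaway instance (container name and credentials here are placeholders):

```shell
# Start a disposable local Postgres on the default port.
# POSTGRES_PASSWORD is required by the official image.
docker run --rm -d \
  --name local-pg \
  -e POSTGRES_PASSWORD=devpassword \
  -p 5432:5432 \
  postgres:16

# Connect using psql bundled in the same container.
docker exec -it local-pg psql -U postgres
```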


I've not used Claude yet, but why would it be bad if it gains features that people use? Did people ever complain that Photoshop had too many features demanding cognitive load? Excel? Practically every IDE out there? There's a reason people use those tools instead of a plain text editor or Paint. They're for power users, and people will become power users of AI as well. Some will stick with ChatGPT forever and some will use an ever-increasing ecosystem of tools.


Because devs will have no clue how their systems work; the only ones who do will be LLMs, gatekept behind an ever-increasing cost per use.


It is a very significant consideration for every one of those tools. The introduction of the "ribbon" in Excel was moderately controversial in 2007.

The default toolset available in Photoshop is why it remains on top to this day.


Good question. The difference with AI tools is that the interface isn't stable the way Photoshop or Excel is. With traditional software you learn it once and muscle memory carries you. With LLM tools the model itself changes, the optimal prompting style shifts, and features interact with model behavior in unpredictable ways, so the cognitive load compounds differently. Not saying features are bad, just that the tradeoffs are different.


Isn't that the same wisdom as avoiding cyclic dependencies?


It is not only that. An acyclic graph can still be dense: as you add more nodes, the number of edges can grow as O(n^2).

A polytree, by contrast, is a tree once you ignore edge directions, so the number of edges grows linearly with the number of nodes (exactly n - 1 for n nodes).
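The contrast can be made concrete by counting edges. A minimal sketch (the function names are my own, and the path graph is just one example of a polytree):

```python
import itertools

def complete_dag_edges(n):
    # A DAG can still be dense: order the nodes and point every
    # earlier node at every later one -> n*(n-1)/2 edges, O(n^2).
    return [(i, j) for i, j in itertools.combinations(range(n), 2)]

def polytree_edges(n):
    # A polytree's underlying undirected graph is a tree, so it has
    # exactly n - 1 edges. Here, a simple directed path as an example.
    return [(i, i + 1) for i in range(n - 1)]

assert len(complete_dag_edges(10)) == 45  # quadratic growth
assert len(polytree_edges(10)) == 9       # linear growth
```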


I don't know; I tend to come across new tools written in Rust, JavaScript, or Python, but relatively few in C. The number of times I see "cargo install xyz" in the git repo of some new tool is definitely noticeable.


Awesome, love it!

