Hacker News | massysett's comments

These customers own expensive cars - or at least, cars that were expensive when they were new. The car might now be ten years old or more, and the owner bought it used. They want a prestige marque but don't have the money to buy a new prestige car, so they are looking to save on service.

All the time I see cars with expensive names - BMW, Mercedes-Benz - broken down on the side of the road while old Hondas and Toyotas keep cruising by. Those are the customers for this shop: they spent all their money buying an expensive used car, and now they can't afford to maintain it and fix looming problems; meanwhile, the Toyota or Hyundai driver keeps up with maintenance and maybe even takes the car to the dealer for it.

A mechanic like this can't afford to hire someone to answer the phone. Such a person is expensive, and these customers want rock-bottom prices despite driving expensive cars. So a chatbot is good enough, and better than nothing.


The most trustworthy mechanic I used in England had an appointment book pretty much full for four months in advance. He didn't answer the phone, didn't have a computer, just a desk diary. If you wanted him to work on your car you turned up at his workshop and spoke to him. If you were willing to wait until he'd finished whatever thing he was doing he'd take a quick look at your car and suggest a course of action. And despite his full order book if something looked urgent enough and small enough he'd fit you in quite quickly.

He charged reasonable prices, but definitely not rock bottom. He had no need to compete with the bottom feeders because every customer acted as his public relations agent.

How would a chatbot help?


Ok, presume it is. Why is this a useful observation? The author still needed to poke and prod the LLM to produce useful information. She still needed to know what questions to ask and what prompts to give, and hopefully she steered it right when it made up falsehoods.

I’ve used CL for years, and the layered model fits my experience, yet I never conceived of it exactly that way. It’s useful. So what if an LLM wrote it?


If it helps, the article “evolved,” so I don’t really care that LLMs had a part to play. I am setting up a development environment for Mezzano, the Common Lisp OS, after getting it running on ARM64. I needed to understand the full CL toolchain to build an AI agent harness that could talk to Mezzano.

I figured out I could do this via SWANK, but I kept hitting the same problem: the information about how all the pieces fit together is scattered across dozens of sources, and nobody, as far as I can tell, had put a complete layered map in one place. I kind of already had one from all the conversations and research I’d been doing, so I glommed it all together and posted it to r/lisp.
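For anyone curious what "talking to Mezzano via SWANK" involves at the wire level: SWANK messages are a 6-digit hex length header followed by an s-expression. A rough sketch of that framing in Python, as a harness client might do it (the `swank:connection-info` call and package name are just illustrative, and this assumes a stock Slime/Swank server):

```python
def frame(sexp: str) -> bytes:
    """Encode one SWANK message: 6 lowercase hex digits of payload length, then the payload."""
    payload = sexp.encode("utf-8")
    return b"%06x" % len(payload) + payload

def unframe(data: bytes) -> str:
    """Decode a single framed message back to its s-expression text."""
    length = int(data[:6], 16)
    return data[6:6 + length].decode("utf-8")

# An :emacs-rex request asking the server to describe itself;
# the trailing integer is a continuation id the client picks.
msg = '(:emacs-rex (swank:connection-info) "COMMON-LISP-USER" t 1)'
wire = frame(msg)
assert unframe(wire) == msg
```

Actually parsing the s-expression replies is the harder part; the framing itself is the easy ten lines.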

BTW, the Lisp community has been really helpful, so I’ve incorporated, and continue to add, the corrections and pointers people have been giving. Case in point: someone above pointed out vend, which is an interesting approach that might be useful for my Lisp harness project.


When I read an article, I expect to read the author's own experiences and the insights they gained from them, not the regurgitation of an industrial-scale word generator.

> She still needed to know what questions to ask and prompts to give

Then publish the prompts. Let me enter them in an LLM of my choosing and see what bullshit it hallucinates and diff it against the 'article'.

> hopefully steered it right when it made up falsehoods.

"Hopefully"? Publishing something a stochastic parrot dreamed up under your name is ghost writing at best and spreading misinformation at worst.


The "insight" that I needed a map, and that I had effectively created a map from my research, reading and "prompting" was mine, but I have no problem with using fancy tooling to help me pull it all together.

If someone could've pointed me to some other fully laid out mapping of the CL tooling stack I would've been happy as the article was a rather time consuming side quest.


> something a stochastic parrot dreamed up

With more time, energy, and human discovery and invention, the statistical machinery backing these information digests will improve beyond what any one human could internalize and write down in a lifetime.


> will improve

If only I was capable of such divination.


What I do see is a somewhat curated cache of what a stochastic parrot dreamed up, so I don’t have to burn tokens myself.

As I understand it, the author is interested in the topic and didn’t simply publish total hallucinations.


Author here. Deeply interested, but not an expert by any means; happy to have saved anyone a few tokens. I have done my best to fact-check the content, and the people on r/lisp have contributed a ton of corrections that I incorporated into revised edits. Constructive input is always welcome; if you have spotted any mistakes, let me know.

Hi, well, you see, it doesn’t matter how many times you repeat “I am not an expert, I did it for myself, and I’m just sharing in case someone else is interested.”

Assholes will come out of the woodwork claiming only experts are allowed to post anything online.

My point is: stop being apologetic, as it only eats your energy, and DGAF about comments like the top one I replied to.


Thank you! Point taken and appreciated. Time is better spent on producing better materials. I have made a short version of the post, since the primary article being too long was a valid criticism.

Yeah, but Baby Bell would dispatch a technician to your house if needed.

What kind of Brother laser printer is this? If it’s PostScript, PCL, or AirPrint-capable, you don’t need drivers.

Brother ships mystery-meat Linux "drivers" with a PPD that sends your document through Ghostscript, I guess to scrub PostScript features the printer doesn't support.

I tried just using a generic PPD from openprinting.org, but that caused the printer to spit out a ream of pages printed with binary junk, so mystery drivers it is.


“IPv6 is waiting for adoption”

A major website sees over 46 percent of its traffic over IPv6. A major mobile operator runs its network entirely over IPv6.

This is not “waiting for adoption,” so I stopped reading there.

https://www.google.com/intl/en/ipv6/statistics.html

https://www.internetsociety.org/deploy360/2014/case-study-t-...
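For what it's worth, measuring that kind of traffic share from your own access logs is trivial. A toy sketch (the sample addresses below are documentation-range placeholders, not real clients):

```python
import ipaddress

def ipv6_share(client_addrs):
    """Fraction of the given client addresses that parse as IPv6."""
    v6 = sum(1 for a in client_addrs if ipaddress.ip_address(a).version == 6)
    return v6 / len(client_addrs)

# RFC 3849 / RFC 5737 documentation addresses, standing in for log entries.
sample = ["2001:db8::1", "192.0.2.10", "2001:db8::2", "198.51.100.7"]
print(ipv6_share(sample))  # 0.5
```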


You'd happily deploy a website for use by the general public on IPv6-only?

I use Devouring Goats on your straw man. It's Super Effective!

To be less glib: IPv6 is well-adopted. It's not universally adopted.


Until you would happily deploy a service on IPv6 only, I suggest that you're still dependent on IPv4.

I'll repeat myself:

  IPv6 is well-adopted. It's not universally adopted.

No. Which doesn’t prove the technology has not been adopted. The internet also consists of much more than public-facing websites. So what’s your point?

My point is that we're still dependent on IPv4. For all the progress IPv6 has made, no-one is willing to switch IPv4 off yet. Until we do, we're still constrained by all the problems IPv4 has.

Plenty of people are switching v4 off. Facebook runs basically all of its datacenters without v4. T-Mobile USA uses only v6 on its network. Thread only supports v6 in the first place.

There are plenty of other places doing the same thing, but these examples alone should be sufficient to disprove "no-one is willing to turn v4 off".


I don’t get it either. The CMUCL compiler is named Python, no relation to the prominent language. Not sure that’s what this was about though.

That was his confusion.

Heh, I guess Apple needs to better use AI to review all the AI-written apps.


Easy: MacBook Air. The friend is asking this question, so that’s what they need. If they needed a MacBook Pro, they wouldn’t be asking this question. If they wanted to spend as little as possible, they would have already bought something cheap, like a PC or Chromebook or now this Neo, so they wouldn’t be asking this question.


However, with the recent MacBook Neo, I actually went ahead and recommended the Neo, especially to a friend of mine who's going into college soon and asked me what they should buy.

Now, the 8 GB can be a concern to some, but not to many, IMO. I'm also feeling a bit optimistic that Apple will realize the largest criticism of this product is that it doesn't have 16 GB, since otherwise even more people would buy it. So in the future, I expect a 16 GB option too (when the RAM bubble finally bursts).


There are few enough differences that if I could get an old Studio Display at a discount, I would. But right now, it seems the old one is still full price where it's available.


Yeah, because it had a government-sanctioned monopoly.

https://en.wikipedia.org/wiki/Kingsbury_Commitment

