The Internet was a great resource for sharing and pooling human knowledge.
Now generative AI has come along to dilute knowledge in a great sea of excrement. Humans have to hunt through the shit to find knowledge.
To be fair, humans were already diluting it in a great sea of excrement; the robots just came to take our jobs and do it even faster and better.
Ok.
> uses search engine
> search engine gives generative AI answer
God dammit
> scroll down
> click search result
> AI Generated article
> search engine gives generative AI answer
> It cites its source, so it can’t be that bad, right?
> click link to source
> It’s an AI generated article
Oh no.
AI will give the correct, real source and then still make shit up. Bing linked to Bulbapedia to tell me Wailord was the heaviest Pokémon. Bulbapedia knows it isn’t close; bingpt doesn’t know shit.
It’s funny, because I’ve also used an LLM to get info about Pokémon, and it didn’t make any sense either.
It’s fantastic at templating
Just don’t trust what it fills the template with
Use udm14.org.
Legend.
There’s also udm14.com if you want to have cheeky fun with it.
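For context, as I understand it those sites just redirect you to a normal Google search with the `udm=14` parameter appended, which pins results to the plain “Web” tab with no AI overview. A minimal sketch of building such a URL yourself (the function name is my own):

```python
from urllib.parse import urlencode

def web_only_google_url(query: str) -> str:
    """Build a Google search URL pinned to the plain 'Web' tab.

    udm=14 selects the Web results view, which (as of this writing)
    skips the AI overview and most other widgets.
    """
    params = urlencode({"q": query, "udm": 14})
    return f"https://www.google.com/search?{params}"

print(web_only_google_url("heaviest pokemon"))
# https://www.google.com/search?q=heaviest+pokemon&udm=14
```

udm14.com does essentially this server-side, so you can use it as a search shortcut without any setup.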
When search engines stop being shit, I will.
deleted by creator
¯\_(ツ)_/¯
To make a pie, you’ll need a pastry crust, a filling, and a baking dish. Here’s a basic guide:
Ingredients:
For the pie crust:
2 1/2 cups all-purpose flour
1 teaspoon salt
1 cup (2 sticks) unsalted butter, cold and cut into small pieces
1/2 cup ice water
For the filling (example - apple pie):
6 cups peeled and sliced apples (Granny Smith or Honeycrisp work well)
1/2 cup sugar
1/4 cup all-purpose flour
1 teaspoon ground cinnamon
1/2 teaspoon ground nutmeg
1/4 teaspoon salt
2 tablespoons butter, cut into small pieces
Instructions:
- Make the pie crust:
Mix dry ingredients:
In a large bowl, whisk together flour and salt.
Cut in butter:
Add cold butter pieces and use a pastry cutter or two knives to cut the butter into the flour mixture until it resembles coarse crumbs with pea-sized pieces.
Add water:
Gradually add ice water, mixing until the dough just comes together. Be careful not to overmix.
Form dough:
Gather the dough into a ball, wrap it in plastic wrap, and refrigerate for at least 30 minutes.
- Prepare the filling:
Mix ingredients: In a large bowl, combine apple slices, sugar, flour, cinnamon, nutmeg, and salt. Toss to coat evenly.
- Assemble the pie:
Roll out the dough: On a lightly floured surface, roll out the chilled dough to a 12-inch circle.
Transfer to pie plate: Carefully transfer the dough to a 9-inch pie plate and trim the edges.
Add filling: Pour the apple filling into the pie crust, mounding slightly in the center.
Dot with butter: Sprinkle the butter pieces on top of the filling.
Crimp edges: Fold the edges of the dough over the filling, crimping to seal.
Cut slits: Make a few small slits in the top of the crust to allow steam to escape.
- Bake:
Preheat oven: Preheat oven to 375°F (190°C).
Bake: Bake the pie for 45-50 minutes, or until the crust is golden brown and the filling is bubbling.
Cool: Let the pie cool completely before serving.
Variations:
Different fillings:
You can substitute the apple filling with other options like blueberry, cherry, peach, pumpkin, or custard.
Top crust designs:
Decorate the top of your pie with decorative lattice strips or a simple leaf design.
Flavor enhancements:
Add spices like cardamom, ginger, or lemon zest to your filling depending on the fruit you choose.
No.
I ask GPT for random junk all the time. If it’s important, I’ll double-check the results. I take any response with a grain of salt, though.
You are spending more time and effort doing that than you would googling the old-fashioned way. And if you don’t check, you might as well throw a magic 8-ball: less damage to the environment, same accuracy.
The latest GPT does search the internet to generate a response, so it’s currently a middleman to a search engine.
No, it doesn’t. It incorporates an unknown number of words from the internet into a machine whose only purpose is to sound like a human. It’s an insanely complicated machine, but the truthfulness of the response is not only never considered, it’s impossible to even specify as a desired result.
And the fact that so many people aren’t equipped to recognise it behind the way it talks could be baffling, but it’s also very consistent with other choices humanity makes regularly.
False.
deleted by creator
And some of those citations and quotes will be completely false and randomly generated, but they will sound very believable, so you can’t tell truth from random fiction until you check every single one of them. At which point you should ask yourself why you added the unnecessary step of burning a small portion of the rainforest to ask a random word generator for stuff, when you could have looked for sources directly, saving that much time and energy.
I, too, get the feeling that the RoI is not there with LLMs. Being able to include operators like “site:” or “ext:” is more efficient.
I just ran another test: Kaba. Just googling “kaba” gets you a German wiki article explaining it means KAkao + BAnana.
ChatGPT: it is the combination of the first syllables of KAkao and BEutel (Beutel is German for bag).
It just made up the important part. On top of that, ChatGPT says Kaba is a famous product in many countries, which I’m sure it is not.
deleted by creator
LLMs are great at cutting through noise
Even that is not true. It doesn’t have the aforementioned criterion for truth, and you can’t give it one.
LLMs are great at generating noise that humans have a hard time distinguishing from meaningful text. Nothing else. There are indeed applications for it, but due to human nature, people think that since the text looks coherent, the information it contains will also be reliable, which is very, very dangerous.
deleted by creator
You do have this issue; you can’t not have this issue. Your LLM, no matter how big the model is and how much tooling you use, does not have a criterion for truth. The fact that you’ve made this invisible to yourself is worse, so much worse.
deleted by creator
So, if it isn’t important, you just want an answer, and you don’t care whether it’s correct or not?
The same can be said about search results. With search results, you have to use your brain to determine what is correct and what is not. Now imagine using those same brain cells to determine whether the AI’s answer needs a check.
AI is just another way to process the search results, one that happens to give you the correct answer up front most of the time. If you blindly trust it, that’s on you.
With the search results, you know what the sources are. With AI, you don’t.
Who else is going to aggregate those recipes for me without making me scroll past ads and personal-blog bs?
Not an LLM, that’s for sure
Tell me you’re not using them without telling me you’re not using them.
The fuck do you mean “without telling”? I am very explicitly telling you that I don’t use them, and I’m very openly telling you that you shouldn’t either
I use them hundreds of times daily. I’m 3-5x more productive thanks to them. I’m incorporating them into the products I’m building to help make others who use the platform more productive.
Why the heck should I not use them? They are an excellent tool for so many tasks, and if you don’t stay on top of their use, in many fields you will fall irrecoverably behind.
It works great for me
cool logical fallacy you allow to rule your life
OK lol
Start using SearXNG.
deleted by creator
Can you briefly explain how this works? Do you have a link or something similar?
There are many projects; just search for clones of Perplexity. Most use SearXNG + LLMs. I recently used one called yokingma/Search_with_ai, but there are others.
Thanks for the new rabbit hole! 😁
brother eww
SearXNG still uses the same search engines.
Yes; however, using a public SearXNG instance makes your searches effectively private, since it’s the server doing them, not you. It also does not use generative AI to produce the results, and won’t until or unless the ability to do normal searches is removed.
And at that point, you can just disable that engine for searching.
from a privacy perspective…
you might as well use a VPN or Tor. Same thing.
Yes, but that’s not the only benefit. It’s a metasearch engine, meaning it queries all the individual engines you ask for and combines the results into one page. That makes it more akin to DDG, except it doesn’t rely on just one search provider.
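For the curious, the “metasearch” part is conceptually tiny: fan the query out to several backends, merge the result lists, drop duplicates. A toy sketch in Python (the backends here are made-up stand-ins returning canned URLs, not real search APIs):

```python
# Toy sketch of what a metasearch engine like SearXNG does conceptually:
# fan one query out to several backends and merge the result lists.
from typing import Callable

def metasearch(query: str, backends: dict[str, Callable[[str], list[str]]]) -> list[str]:
    """Query every backend, merge the results, de-duplicate, keep order."""
    seen: set[str] = set()
    merged: list[str] = []
    for _name, fetch in backends.items():
        for url in fetch(query):
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

# Stand-in backends: each returns canned results for any query.
backends = {
    "ddg": lambda q: ["https://a.example", "https://b.example"],
    "mojeek": lambda q: ["https://b.example", "https://c.example"],
}
print(metasearch("pie recipe", backends))
# ['https://a.example', 'https://b.example', 'https://c.example']
```

The real thing adds per-engine scoring and result merging, but the fan-out-and-dedupe core is the same idea.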
it’s a fantastic metasearch engine, but people frequently don’t configure it to its full potential, IMO. One common mishap is the default setting that sends queries to Google. 💩
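If you run your own instance, that default is easy to flip. A sketch of the relevant `settings.yml` fragment (engine names assumed to be the stock ones; check your instance’s docs):

```yaml
# SearXNG settings.yml sketch: disable Google as a backend so queries
# only fan out to the other configured engines.
engines:
  - name: google
    disabled: true
  - name: duckduckgo
    disabled: false
```

On public instances the same toggle is usually exposed per-user in the preferences page, no config file needed.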