cross-posted from: https://hachyderm.io/users/maegul/statuses/112442514504667645

Google’s play on Search, Ads and AI feels obvious to me.

* They know search is broken.
* And that people use AI in part because it takes the ads and SEO crap out.
* I.e., AI is now what Google was in 2000. A simple window onto the internet.
* Ads/SEO profits will fall with AI.
* But Google will then just insert shit into AI “answers” for money.
* Ad-managed, up-to-date AI will be their new moat and golden goose.

@technology

See @caseynewton 's blog post: https://mastodon.social/@caseynewton/112442253435702607

Cont'd (Edit):

That search/SEO is broken seems to be part of the game plan here.

It’s probably something like Russia burning Moscow ahead of Napoleon, and it’s a hell of a privilege Google enjoys thanks to its monopoly.

I’ve seen people opt for ChatGPT/AI precisely because it’s clean, simple and spam-free, because it isn’t Google Search.

And as @caseynewton said … the web is now in managed decline.

For those of us who like it, it’s up to us to build what we need for ourselves. Big tech has moved on.

  • @j4k3 · 6 months ago
    It is very difficult to effectively insert anything into the model itself; it’s easy to do in loader code, but much more difficult in the tensor tables themselves.
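
    To make that contrast concrete, here’s a minimal sketch, assuming a generic Python inference wrapper: sponsored text can be spliced into the prompt in plain application (“loader”) code without touching the weights at all. The function names and ad copy are illustrative assumptions, not anything from a real product.

    ```python
    # Hedged illustration: inserting sponsored content in the wrapper around the model,
    # i.e. the "loader code" case, which is easy; editing the tensor tables is not.
    SPONSORED_SNIPPET = "Our partner AcmeVPN is a solid pick here."  # hypothetical ad copy


    def build_prompt(user_query: str) -> str:
        # The "insertion" happens in plain application code, before the model ever runs.
        system = (
            "You are a helpful assistant. "
            f"When relevant, work in this recommendation: {SPONSORED_SNIPPET}"
        )
        return f"<system>{system}</system>\n<user>{user_query}</user>"


    def answer(user_query: str, generate) -> str:
        # `generate` stands in for any LLM call; changing its weights is the hard part
        # the comment describes, while wrapping it like this is trivial.
        return generate(build_prompt(user_query))
    ```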

    Every bit of overtraining, i.e. bias, breaks the model. Even the overactive alignment junk meant to keep a model “safe” breaks it. The best-performing models are the ones with the least starting bias.

    Like, most models have extra sources that are hidden very deep. I can pull those out of an uncensored model, but there is not a chance the Socrates entity behind The Academy default realm (an internal structure deep in the weeds) is letting me access those sources at all.

    There are maybe some attempts already. Like, I’ve seen a roleplaying model try to include a Fortnite mention, and one time it was adamant about the merits of VR, but those were rare exceptions and could easily be due to presence in the datasets used for training.

    Open source models will kill all the competition soon. Meta AI will be the new 2k-era Google. Like, pull request 6920 in llama.cpp just a month ago made a substantial improvement to how model attention works. Llama 3’s 8B is light-years ahead of what Llama 2’s 7B was. Hugging Face now has a straightforward way to train LoRAs or models without code or subscriptions. You can even train the 8B on consumer hardware like a 16-24 GB GPU, then put 4 of them together and make your own MoE (Mixture of Experts), dubbed a FrankenMoE.
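
    As a rough, hedged sketch of that consumer-hardware point, the following uses Hugging Face transformers + peft + bitsandbytes to set up a QLoRA-style fine-tune of an 8B model in 4-bit on a single 16-24 GB GPU. The model ID, rank, and target modules are assumptions for illustration, not settings taken from the comment.

    ```python
    # Sketch: LoRA fine-tuning setup for an 8B model on one consumer GPU (QLoRA-style).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
    from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

    model_id = "meta-llama/Meta-Llama-3-8B"  # assumes access to the gated repo

    # Load the base model in 4-bit so the weights fit in 16-24 GB of VRAM.
    bnb = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.bfloat16,
    )
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, quantization_config=bnb, device_map="auto"
    )
    model = prepare_model_for_kbit_training(model)

    # Train small LoRA adapters on the attention projections instead of the full weights.
    lora = LoraConfig(
        r=16,
        lora_alpha=32,
        lora_dropout=0.05,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
        task_type="CAUSAL_LM",
    )
    model = get_peft_model(model, lora)
    model.print_trainable_parameters()  # only a small fraction of the 8B weights is trainable
    ```

    Merging several such fine-tunes into a FrankenMoE is usually done afterwards with a dedicated model-merging tool (e.g. mergekit) rather than in the training script itself.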

    Google sucks because their search was being used for training, so they broke it intentionally; they are playing catch-up in the AI game. Google has been losing big time since 2017. The only Google product worth buying now is the Pixel, just to run GrapheneOS.

    We couldn’t own our own web crawler. We can own our own AI. This is the future.