#MetaGer

Suma-EV, the operators of #metager, have just published the huge news that the #EuropeanOpenWebIndex will soon be usable as a search source. The plan is to integrate it free of charge. A good time to switch search engines or even to become a member!

suma-ev.de/en/newsletter-archi

@MetaGer

#degoogle #unplugtrump

@kubikpixel besides using europe-hosted and ideally decentralized services, I also like to use specific search engines for specific tasks. Inspired by DDG's bangs, I created serçi to use bangs/keywords with my favorite search engine #MetaGer. You can #selfhost serçi on a tiny webhost or VPS. It's small, fast and minimal in terms of data usage! :)

@maexchen1 @pallenberg The free search on #MetaGer was not shut down due to a lack of users, but because Yahoo terminated its EU contracts.

The token-based version of MetaGer still works, though; it is being continuously developed and only recently gained new search engines.

Looks like #MetaGer also implemented #Google queries via https://serper.dev/ now. It costs only 1 token per request, which is the same as #Brave and #Mojeek. Let's see how the search results turn out with this.

Two new changes in my Meta² Search Engine #serci:

1. Due to #MetaGer being forced to disable their advertisement-financed free offering, the default search engine is now the France-based #Qwant.
2. Users can now choose an individual default search engine from all configured search engines via a drop-down. This sets a cookie in your browser (a rough sketch of how that works follows below).
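Roughly, the cookie handling boils down to something like this in plain PHP (a simplified sketch with illustrative names, not the exact serĉi code):

<?php
// Simplified sketch: remember the user's chosen default engine in a cookie
// and fall back to the instance default (Qwant) otherwise. Names like
// $engines, engines.json and the "engine" cookie are illustrative only.
$engines = json_decode(file_get_contents(__DIR__ . '/engines.json'), true);

// The drop-down submits e.g. ?default=metager; store the choice for one year.
if (isset($_GET['default']) && isset($engines[$_GET['default']])) {
    setcookie('engine', $_GET['default'], time() + 365 * 24 * 60 * 60, '/');
    $_COOKIE['engine'] = $_GET['default'];
}

// Instance-wide default is Qwant unless the cookie says otherwise.
$default = $_COOKIE['engine'] ?? 'qwant';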

So, if you (like me) want to continue using @metager@suma-ev.social via the paid tokens (or support them via a membership, which I would urge you to do), you can set it as your personal default search engine in serĉi again :)

Changelogs for these two serĉi changes:
1. https://src.jayvii.de/pub/serci/commit/8634b8ee623f1b7abfc1815ea8aab53a643683af.html
2. https://src.jayvii.de/pub/serci/commit/9b9da13251d2e1fec60631846fc3d22c29c08df6.html

After a few performance improvements, I did some load-time tests for #serci between my home network (wifi) and the random VPS where the service is running.

echo "html,redirect" | tee timestamps.csv
for i in $(seq 1 1 100); do
    HTML=$(curl -o /dev/null -s -w '%{time_total}' https://search.jayvii.de)
    REDI=$(curl -o /dev/null -s -w '%{time_total}' https://search.jayvii.de?q=test)
    echo "$HTML,$REDI" | tee --append timestamps.csv
done


Across 100 runs, loading the site's HTML (generated from pure #PHP) versus requesting a redirect to a chosen service (here the default, #MetaGer), I measure on average 0.28s for loading the frontend (HTML) and 0.13s for processing the input and issuing the redirect.
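The averages can be computed straight from timestamps.csv, for example with a few lines of PHP (just one way to do it, not part of the service itself):

<?php
// Compute mean load times from timestamps.csv (header: html,redirect).
$rows = array_map('str_getcsv', file('timestamps.csv', FILE_IGNORE_NEW_LINES));
array_shift($rows); // drop the header line

$html = array_column($rows, 0);
$redi = array_column($rows, 1);

printf(
    "html: %.2fs, redirect: %.2fs\n",
    array_sum($html) / count($html),
    array_sum($redi) / count($redi)
);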

I am quite happy with this relatively low overhead, although performance may decrease a little as more services are added (currently: 47). At some point an #sqlite database may become more efficient than my pre-constructed #json files, which are loaded on demand.
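For comparison, the two lookup strategies would roughly look like this (file names and the table schema are placeholders, not the actual layout):

<?php
// Current approach (simplified): parse a pre-constructed JSON file on demand.
$engines = json_decode(file_get_contents(__DIR__ . '/engines.json'), true);
$metager = $engines['metager'] ?? null;

// Possible alternative: fetch only the needed row from an SQLite database,
// so the full engine list never has to be parsed on every request.
$db  = new SQLite3(__DIR__ . '/engines.sqlite', SQLITE3_OPEN_READONLY);
$stm = $db->prepare('SELECT url_template FROM engines WHERE name = :name');
$stm->bindValue(':name', 'metager', SQLITE3_TEXT);
$row = $stm->execute()->fetchArray(SQLITE3_ASSOC);

With only ~50 entries the difference is probably negligible, though; JSON files stay cheap to parse at that size.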

Lately, I've been using @MetaGer #MetaGer as my main #SearchEngine, but I was really missing bangs (known from #DuckDuckGo and other search engines).

So this morning I threw together a short #PHP site that I can use as a search front-end, which handles bangs and redirects me to the correct search engine (is this meta-meta? ;D):
https://search.jayvii.de/
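In essence, the bang handling boils down to something like this (a simplified sketch; the bang names and URL templates are illustrative, not the full configuration):

<?php
// Map each bang to a URL template with a %s placeholder for the query.
$bangs = [
    'mg'  => 'https://metager.org/meta/meta.ger3?eingabe=%s',
    'ddg' => 'https://duckduckgo.com/?q=%s',
];

$q = trim($_GET['q'] ?? '');

// A query like "!ddg privacy search" splits into the bang ("ddg") and the terms.
if (preg_match('/^!(\S+)\s+(.*)$/', $q, $m) && isset($bangs[$m[1]])) {
    $target = sprintf($bangs[$m[1]], urlencode($m[2]));
} else {
    // No (known) bang: send the whole query to the default engine.
    $target = sprintf($bangs['mg'], urlencode($q));
}

// Redirect the browser to the chosen search engine.
header('Location: ' . $target, true, 302);
exit;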

If you find bugs or want to contribute, feel free to check out my git repo:
https://src.jayvii.de/pub/traserci

Both #DuckDuckGo and #Qwant seem to be all in on #AI-powered summaries of search results. With every year, online search seems to slip further into #enshittification, with #SEO madness and #AI-generated content and searches.

Time to pick up other search engines like #MetaGer, #mojeek or #SearX