
What Would You Like To See In Tomorrow's 2023 Search Engine?


TheStudent2023


Hard Women & Gentlemen,

What features would you like in a search engine as a:

Searcher (buyer/shopper/consumer)?
Website (vendor/seller)?

I am fed up with the search engines.

Search engines have not changed since 1998. Have they?
Let's brainstorm a new way of searching so we can kiss goodbye to today's search engines.

Originally, we used to find links through link directories like Yahoo and DMOZ, by clicking through categories and subcategories to find the right links.

Then alta-vista.com and webcrawler.com built the first search engines, where we started searching for keywords and the SE presented us with the right or wrong links.
Are you not fed up with finding links this way by now? 25 years is enough!
Let's think of other ways. New ways.

Are you not capable of thinking up better ways? You are programmers! You do not need a businessman to come up with a new solution. Now do you?

In short:
What Would You Like To See In Tomorrow's 2023 Search Engine?


Tough & Rough programmers (hard guys),

If you deem nothing is better than a search engine, then let us see if we can at least brainstorm a unique & better search engine. What do you say?

1. What features would you like in a search engine as a Website (vendor/seller)?

2. What features would you like in a search engine as a Searcher (buyer/shopper/consumer)?


Brainstormers,

Let me make the first move. Then, if you get any ideas, you may chime in.

**MY SEARCH ENGINE FEATURE SUGGESTIONS AS A CONSUMER/SEARCHER/VISITOR/USER**

1. I would like to see the SERPs listing the LIKES, LOVES, DISLIKES and HATES of each link listed on the SERP.

2. I would prefer SERPs to list the phone numbers of the websites so we can call them. This way, we do not need to click over to the website and hunt for the contact-us page to find their phone number.

(The SE can charge websites on a pay-per-call model.)

3. Same as 2 above, but this time it is the chat page link that is listed.

4. SERPs should list how many visitors each listed link currently has (how many visitors the search engine has currently sent). Then you can click over to the busiest link of them all, as that tells you it is more popular than all the other sites listed on the SERP.

5. After you click a link on the SERP and visit a website, you have to scroll down the page to find the content (link description) that was mentioned on the SERP. It would be better if my browser auto-scrolled me to the appropriate part of the page and highlighted the content that was quoted on the SERP.
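On point 5: Chromium-based browsers already support something close to this through URL text fragments (the #:~:text= syntax), so a SERP could deep-link straight to the quoted snippet. A minimal PHP sketch, where the result URL and snippet are made-up examples:

```php
<?php
// Sketch: link a SERP result straight to its quoted snippet using a URL
// text fragment (#:~:text=...), which supporting browsers scroll to and
// highlight. The URL and snippet are placeholder examples.
$resultUrl = 'https://example.com/reviews';
$snippet   = 'best budget laptop of 2023';

$deepLink = $resultUrl . '#:~:text=' . rawurlencode($snippet);

// Render the SERP entry as a link that jumps to (and highlights) the snippet.
echo '<a href="' . htmlspecialchars($deepLink) . '">' . htmlspecialchars($snippet) . '</a>';
// -> https://example.com/reviews#:~:text=best%20budget%20laptop%20of%202023
```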

 

You know what, there are too many things I would want the likes of Google to have, but I won't bore you any more unless this thread gets a life of its own. It's your turn next.

Edited by TheStudent2023

1 hour ago, TheStudent2023 said:

Search engines have not changed since 1998. Have they?

Search engines have changed a lot since 1998. They are about to change a lot again thanks to things like ChatGPT, I suspect.


This is what I have in mind:

A. As a loyal user of the search engine, you should earn in some way from your search activities.

B. As a listed website, your links should earn in some way, even from the activities of:

1. non-buyers;
2. back button hitters;
3. competing links.

 

Now let us revolve our features around these two ideas and think up new features for a new search engine.

 

Edited by TheStudent2023

Fellow Programming Buddies,

How about we get brainstorming unique features for a new search engine, and each of us can work on our own thought-up feature and share the code here for everyone's benefit?
You never know, this forum might even adapt to our way.

Of course, the code should have no strings attached. 100% free.
Let's have fun, shall we?
Let us think of some features that will make it more WOW than the current big boys. Yes?

Just imagine: your own way of searching. Your own search engine. Your own baby. Your own PAL!

I am interested! What about you? Yes, YOU!

Edited by TheStudent2023

@requinix

What does your smiley mean?

Anyway, bing.com uses ChatGPT, based on the YouTube video I watched last night.
Leaving aside ChatGPT and the AI stuff, what do you miss in a search engine?

And do you reckon ChatGPT will kill the search engines?

 


Pro Online Businessers & Technicians,

My PHP search engine is finished, but my web crawler is not. I am building two versions of the crawler for two different environments.
Basically, on my search engine I will add a link for you to submit your website. No big deal.
Yes, I know, I know. Since my SE will be unknown, it would be foolish of me to feel over the moon expecting every website to know me, find their way to my site, submit their links and have my index built that way. That is why I have been busy the last few days building the PHP crawler. With your help, of course. Lol!
I have wisely decided that my web crawler will not bot the web the old-fashioned way, where I point it at a link and it crawls all the domestic links and finds its way onto other domains. Messy. Best to feed it a map.
Hence, I am going to program the crawler not to wander off to any website other than the domain I set it to.
So, at the beginning, I will manually set it to one domain's XML sitemap. To one URL only. Then it will stay on that domain, extracting & crawling all the domestic pages.
Then it will move on to the next domain's XML sitemap on the list I feed it, and do the same, in like fashion.
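To make that plan concrete, here is a minimal PHP sketch of the per-domain idea, assuming a hand-fed seed list of sitemap URLs (the two domains below are placeholders); it parses one sitemap at a time and keeps only the <loc> entries that stay on that sitemap's own host:

```php
<?php
// Minimal sketch: process one domain's XML sitemap at a time and keep only
// the URLs that stay on that domain. The seed list is a placeholder example.
$sitemaps = [
    'https://example.com/sitemap.xml',
    'https://example.org/sitemap.xml',
];

foreach ($sitemaps as $sitemapUrl) {
    $host = parse_url($sitemapUrl, PHP_URL_HOST);

    $body = @file_get_contents($sitemapUrl);
    if ($body === false) {
        continue; // sitemap unreachable, move on to the next domain
    }

    $dom = new DOMDocument();
    if (!@$dom->loadXML($body)) {
        continue; // not valid XML
    }

    // Every <loc> element holds a URL (page URLs in a <urlset>; a sitemap
    // index file would list further sitemaps here instead).
    foreach ($dom->getElementsByTagName('loc') as $loc) {
        $pageUrl = trim($loc->textContent);

        // Do not wander off the domain this sitemap belongs to.
        if (parse_url($pageUrl, PHP_URL_HOST) === $host) {
            echo $pageUrl, PHP_EOL; // in a real crawler: queue it for fetching/indexing
        }
    }
}
```

A real crawler would also need rate limiting and some care around www vs non-www hosts, but the shape of the loop is the point here.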
Here, I have a question for you: how can I find the XML sitemap links of all the websites in existence?
What is your method of finding this out?
I have my own method. I will relate to you what it is. You tell me if it is orthodox or foolish, and whether or not you have a more efficient way to crawl the web than what I am planning on doing.

You see, I had in mind to run my own DNS cache. That way, I would get hold of all the domains that are active across the world. But dealing with BIND is too technical for me. Linux is not my thing! So, do you know of any Windows freeware/shareware/GPL etc. alternatives instead? If so, are you willing to teach me how to set one up to achieve my purpose?
If not, then I will have no choice but to buy the domains list from here:
https://domains-monitor.com/domainzones/
(Note: no affiliate links. Plus, it's not my website.)

After I have downloaded all the active domains, I will use the techniques mentioned in the following tutorial on how to find a website's sitemap.

https://seocrawl.com/en/how-to-find-a-sitemap

I will program the PHP crawler to use those techniques to generate URLs (possible sitemap URLs for each domain).
Then I will have to program the PHP crawler to navigate to the URLs it generated to see whether the generated links are valid or not. I will open a new thread in the PHP section to get help writing PHP code for a crawler to test whether a URL is live or dead (exists or not).
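A rough sketch of how such a check could look in PHP, guessing the usual sitemap locations for a domain and asking cURL for the HTTP status code only; the candidate paths and helper names are my own assumptions, not anything standard:

```php
<?php
// Rough sketch: guess the usual sitemap locations for a domain (including any
// "Sitemap:" lines in robots.txt) and test which candidates actually respond.
function findSitemaps(string $domain): array
{
    $candidates = [];

    // robots.txt often announces the sitemap explicitly ("Sitemap: <url>").
    $robots = @file_get_contents("https://{$domain}/robots.txt");
    if ($robots !== false && preg_match_all('/^\s*Sitemap:\s*(\S+)/mi', $robots, $m)) {
        $candidates = $m[1];
    }

    // Otherwise fall back to the common guesses.
    $candidates[] = "https://{$domain}/sitemap.xml";
    $candidates[] = "https://{$domain}/sitemap_index.xml";

    $found = [];
    foreach (array_unique($candidates) as $url) {
        if (urlIsLive($url)) {
            $found[] = $url;
        }
    }
    return $found;
}

// HEAD request via cURL: treat any 2xx/3xx status as "this URL exists".
function urlIsLive(string $url): bool
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_NOBODY         => true,  // HEAD request, we only want the status code
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 10,
    ]);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    return $status >= 200 && $status < 400;
}

print_r(findSitemaps('example.com')); // example.com is only a placeholder
```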
It is not that hard for me to build a .exe crawler (desktop software) for Windows. So, while I am still learning how to build the web version (.php), I might as well get the desktop crawler to crawl the web and harvest links. Then, when my website is up & running, I can upload the URL list to my website to build the search engine index.
Originally, I did not want to build a .exe crawler, as I did not want to have my home computer on 24/7. I thought it best to use the PHP crawler, which will be installed on my paid hosted VPS. But I have tonnes of internet data saved on my phone SIM. Guess how much data?
I buy data in GB every week. Actually, for 2-3 years I have been buying 10 GB/wk. Sometimes they give another 10 GB/wk bonus. It only costs me around $3 USD / £2 GBP / €2.5 a week. In your countries, how much does it cost to buy 10 GB, and does your ISP give you another 100% bonus? How much MB/GB do you use every week? Just you personally, not your whole household.
On YouTube I actually spend about 4 GB/wk out of the 10 GB/wk. So, if I renew the pack again, my saved data (6 GB) gets rolled over. That is how I managed to save 100 GB. But once I forgot to renew and, wham, I lost all that 100 GB!
This happened to me thrice in one year, I think in 2021. So I lost 300 GB that had been saved up, rolling over each week!
I have been careful lately, for nearly 1.5 years, and have lost no rolled-over data. Guess how much I have managed to save of this rolled-over data this time? Let me check my cell phone. One moment ...

**1036946.37MB**

So that is approximately 1 TB.
How much data have you managed to get rolled over (unused data) like this on your mobile phone SIM?
A few weeks ago I lost roughly another 50 GB of bonus data that I did not finish using, because they changed their plans so that you must use the bonuses they give or lose them. Silly sods!
Anyway, on average how big is a website in MB/GB/TB? Let me calculate roughly how many websites my .exe desktop crawler will be able to spider before I run out of all the saved rolled-over data.
We do have broadband; the household mostly uses that to browse YouTube. I have internet on my mobile to check WhatsApp messages while on the road. I am planning to stop buying it now and save money, but quitting the data bundle on the SIM will immediately mean losing the 1 TB of data that got rolled over. Hence, I am planning on burning some ones & zeros to harvest links for my SE index using the desktop crawler, getting it to use up all the saved data on my mobile SIM. Otherwise I have been saving for nothing! When that data is finished, I can then use the PHP crawler running on my web host's side. Good idea? Yes or no?
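As a rough back-of-the-envelope calculation (the average page sizes are pure assumptions; real pages vary wildly):

```php
<?php
// Back-of-the-envelope: how many pages could ~1 TB of saved mobile data cover?
// The average page sizes below are assumptions; fetching the HTML alone
// (no images, scripts or CSS) stretches the budget much further.
$savedDataMb   = 1036946.37; // the rolled-over balance quoted above
$avgFullPageMb = 2.0;        // assumed average weight of a full page
$avgHtmlOnlyMb = 0.1;        // assumed average weight of the HTML alone (~100 KB)

printf("Full pages: ~%d\n", $savedDataMb / $avgFullPageMb); // roughly 500,000 pages
printf("HTML only:  ~%d\n", $savedDataMb / $avgHtmlOnlyMb); // roughly 10 million pages
```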

Any advice, tips, tricks or unorthodox ventures you would like to suggest?

Thanks for reading my search engine building history.

 

Edited by TheStudent2023

@kicken

Do you know of any good BIND alternatives for Windows? I need to run my own DNS cache just to download all the active domains and their email addresses in the zones.

And someone just suggested an AI crawler to me. I might as well check what all the fuss is about when they add "AI" to a crawler's name, and see if I can beat them by building a better desktop crawler when I start building my own .exe crawler.

If you know of any better AI crawlers, or simpler freeware ones, then let me know too, be they .exe desktop ones or PHP ones.

 

Edited by TheStudent2023

22 minutes ago, TheStudent2023 said:

I need to run my own DNS cache just to download all the active domains and their email addresses in the zones.

You cannot just download a list of all domains and email addresses.  All a DNS cache does is save DNS lookup results to avoid hitting the upstream servers for common lookups.  Windows has one built-in already (see: ipconfig /displaydns).


1 minute ago, kicken said:

You cannot just download a list of all domains and email addresses.  All a DNS cache does is save DNS lookup results to avoid hitting the upstream servers for common lookups.  Windows has one built-in already (see: ipconfig /displaydns).

@kicken

Mmm. So how can I run my own DNS server to download all domains, then, if caching is not the answer?

Actually, you are right. The caching DNS server will only cache the ones users request for DNS resolution. Silly me!


@barand

 

Oops! Sorry! I just noticed your age in your logo. Usually people do not disclose their age, so I will go easy on you. I thought you were probably in your mid-50s. I did not know you were a pensioner! You are a few years older than my old man.

Anyway, take care!

Edited by TheStudent2023

20 minutes ago, TheStudent2023 said:

What is your advice? How can I get hold of all the active domains in the world for free? Have you ever tried it yourself?

My advice is to not bother with it. After a brief search, I found that apparently you can request copies of zone files, but I'd guess your chances of being granted them are low, as it's not a service intended for the general public.


21 hours ago, kicken said:

My advice is to not bother with it. After a brief search, I found that apparently you can request copies of zone files, but I'd guess your chances of being granted them are low, as it's not a service intended for the general public.

Mmm.

How do you reckon these folks get hold of the zone files more than once every day?

https://domains-monitor.com/domainzones/

Maybe I should stop procrastinating, trying to do everything all by myself, and just outsource the job. Meaning: buy the lists from that website. What would you do if you were in my position, where you need a list of all the domain names?

And if you were running your own search engine, what would you yourself do so that your index is not empty or very limited in links? After reading my plan above, do you reckon I am on the right track, or do you suspect I am getting derailed somewhere?



Wise geeks,

Am I correct to assume the following or not?

The robots.txt directives are too messy. Best not to bother programming a crawler to deal with them.

After all, XML sitemaps are built for crawlers. They only list the links the site wants crawled. In that case, there is no need for a crawler to deal with the robots.txt file to learn which pages not to index. To cut to the chase, the crawler can index only the links found listed in the sitemaps.
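One wrinkle to plan for if the crawler relies on sitemaps alone: a sitemap URL can point to a sitemap index that only lists more sitemaps, not pages. A minimal PHP sketch of telling the two apart (the element names come from the sitemaps.org schema; the function name is made up):

```php
<?php
// Minimal sketch: a sitemap URL can resolve to a <urlset> (page URLs) or a
// <sitemapindex> (URLs of more sitemaps). Return page URLs either way,
// recursing into index files.
function pagesFromSitemap(string $sitemapUrl, int $depth = 0): array
{
    if ($depth > 2) {
        return []; // guard against badly nested or circular index files
    }

    $body = @file_get_contents($sitemapUrl);
    if ($body === false) {
        return [];
    }

    $dom = new DOMDocument();
    if (!@$dom->loadXML($body)) {
        return [];
    }

    $pages = [];
    if ($dom->documentElement->localName === 'sitemapindex') {
        // Index file: every <loc> is another sitemap to fetch.
        foreach ($dom->getElementsByTagName('loc') as $loc) {
            $pages = array_merge($pages, pagesFromSitemap(trim($loc->textContent), $depth + 1));
        }
    } else {
        // Ordinary <urlset>: every <loc> is a page URL to index.
        foreach ($dom->getElementsByTagName('loc') as $loc) {
            $pages[] = trim($loc->textContent);
        }
    }
    return $pages;
}
```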

Edited by TheStudent2023

8 hours ago, TheStudent2023 said:

How do you reckon these folks get hold of the zone files more than once every day?

They likely pay various places for the data then consolidate it.

8 hours ago, TheStudent2023 said:

What would you do if you were in my position, where you need a list of all the domain names?

I'd probably buy it.  I've bought compiled lists of US zip codes with various other associated data (Timezone, Lat, Lng, etc) before.  $35 and 5 minutes to buy and import the data was a lot cheaper than spending hours/days/weeks trying to compile it on my own.

 

