Saturday, 22 June 2019 17:34

Bots and your site and some more

So running over statistics my heart nearly took a hit, because I thought my site was slow and a lot of online users were displayed in the Current Sessions statistics in the footer of my website. At first I got happy and thought "Hey, either I must upgrade or the site will suffer from being under-dimensioned in connection capacity, making everything slower". That was until I looked closer into who was online and saw that SEMrush was indexing my site. This is a good thing in itself, but sending approximately 50,000 sessions simultaneously isn't really pleasant for my real users, and therefore I blocked the bots running on the site to get a real feeling of what is what. I blocked both via .htaccess and PHP to be safe.

Here are the entries for .htaccess and your index.php file if you'd like to block SEMrush (remove the entries again to get indexed once more; SEMrush can be a valuable asset in your webmaster toolbox of source sites where you can analyze data, and it can even help you get ideas to improve your website):

.htaccess entry

# mark requests whose User-Agent contains "SemrushBot"
BrowserMatchNoCase SemrushBot bad_bot
# Apache 2.2 style access control; on Apache 2.4 this needs
# mod_access_compat, or use "Require not env bad_bot" instead
Order Deny,Allow
Deny from env=bad_bot

PHP entry inside the index.php home file

<?php
// block requests whose User-Agent contains "SemrushBot" (case-insensitive);
// check isset() first, since some clients send no User-Agent at all
if (isset($_SERVER['HTTP_USER_AGENT'])
    && preg_match('/SemrushBot/i', $_SERVER['HTTP_USER_AGENT']))
{
    header('HTTP/1.0 403 Forbidden');
    echo "Blocked temporarily!";
    exit();
}
?>
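If you'd rather turn the crawling off politely than serve 403s, SEMrush documents that SemrushBot honors robots.txt, so a rule there should achieve the same (a sketch; narrow the path if you only want parts of the site excluded):

```
User-agent: SemrushBot
Disallow: /
```

The upside of this over the blocks above is that no server resources are spent answering the bot at all once it has re-read robots.txt.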

While I do agree that bots are a must to index the whole thing, there's really no need to have hundreds of bots using the same site at the same time. Bots like to surf porn? ;-D This is my own "failure" though, since I'm using another tactic instead of related feeder sites: all domains that I use for this network are being *redirected* to midnight12.com to serve what people are looking for when they enter one of the network domains. 90% of my traffic is visitor type-in traffic originating from good keyword domains. The bad thing is that the URL changes in the browser's address bar, but that's really the only downside to it. It's the best setup for gaining traffic besides doing on-site SEO and keeping to the strategy of building more sites with content related to the targeted niches.

I will optimize the code to be more friendly if someone is looking to copy parts of the website for their own sake. Copyright-wise I really don't mind if e.g. a footer link or a source-code link <!-- source: domain.com --> appears somewhere if you're happy with the result; otherwise just keep the source private and see how far you can bring it ;-)
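As a sketch, the redirect itself can be a one-liner in each feeder domain's .htaccess (assuming Apache with mod_alias; this matches the behaviour described, where the URL in the address bar changes to the target site):

```
# send every request on this feeder domain to the main site (301 = permanent)
Redirect 301 / https://midnight12.com/
```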

Online life right now.

Pornhub.com made some cool changes I noticed; the logo changed, and I thought I'd just got myself a friend. Not sure if it has something to do with me visiting their very large library of porn or if they are just happy to have friends around on the large premise of the internet. I'm trying to promote content that suits my own personal taste (this being a big mistake, because the site would get more traffic if it hosted all niches); perfect content does exist but is really, really rare, mostly only a few streams per website if you're even that lucky. If I started importing content from 3rd-party feeds there would be enough for all sexualities for the coming 1001 years. This would be a bad idea, though, because my server produces timeouts at around 60,000 simultaneous connections, and getting indexed for that much content would produce a traffic level that would make midnight12.com too slow for users to be happy surfing it.

The chance of hundreds of bots on your site is however slightly unreal, because you probably don't have 40 related top-level .com .dk .net domains pointing to your site with an A record <Apache server bound IP> and a CNAME www linkedsite.com.
In the Chrome browser this produces a nice redirect that doesn't make your domain take a hit, but instead just leaves a referrer in the target site's logs. The theory is "the more the merrier", and site-wise this suits me perfectly because there's no need to have a lot of sites promoting a product/niche showing shared content, even though that would mean more content space as in more room for advertisements. Will see where it ends someday. Right now I'm testing theories gathered from years of surfing the internet.
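For anyone wanting to replicate the pointing described above, the records for one feeder domain would look roughly like this in zone-file notation (the domain name and IP are placeholders, not my real ones):

```
; bare domain: A record to the Apache server's IP
feederdomain.com.        IN  A      203.0.113.10

; www: CNAME to the linked site
www.feederdomain.com.    IN  CNAME  midnight12.com.
```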

Content must be unique to every site in your network to be safe. This also protects you from "duplicate content", which in reality is "duplicate tag content". Google is a META search engine and looks into your content to make their algorithm produce SERP results.
Duplicate content could be <title>, <h1> and <meta>, while <body> is the Search Engine's metadata for sorting out relevancy between subpages on the same domain. If the same code is mentioned on another site, that's something else, which is called hype. Hype is a very good thing and only seldom achieved; it depends on one's network, but also sometimes on wallet size or product popularity. First to the mill serves as the father of the hyped content.
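As a made-up illustration of keeping the tag content unique, two pages covering the same niche should at least differ in these spots (titles and descriptions invented for the example):

```
<!-- page one -->
<title>Live Cam Streams | SiteOne</title>
<meta name="description" content="Hand-picked live cam streams, updated daily.">
<h1>Live Cam Streams</h1>

<!-- page two: same niche, but unique tag content -->
<title>Cam Show Archive | SiteTwo</title>
<meta name="description" content="A browsable archive of recorded cam shows.">
<h1>Cam Show Archive</h1>
```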

Bing is more relaxed than Google, and they work slightly differently; thanks for that. This probably comes from being the youngster in the group of Search Engines, trying to fine-tune their algorithm to best suit most of their users' personality/personal preference/taste. I guess this is what they call personalized content, which normally runs per account like on Google. Before that it was a voting/rating system ranking SERP results per click, which the Google Toolbar started to introduce (they were most likely the first to combine clicks/popularity with content metadata and present them as a relevance-sorted search result, divided by keywords, and to sell this too as a product: keyword ads/AdWords).

Storage, Streaming OS and some music.

Smartphones could learn from dual-boot systems, though the producers (Motorola, Nokia, Apple, Samsung, HTC, Huawei and the new pet Xiaomi), like the rest of us, prefer focus; beyond that they are looking for commitment to a brand, customer care and giving an individual experience. Freedom of flavour for the ones with no concerns.

Actually I have a project that is serving a bootable Operating System via the PXE netboot feature. I just need some working clients that will be served centrally, and then a GUI giving the users the free choice of OS and flavour. Developers should be able to upload their ISO and a working configuration to get listed on the boot GUI, maybe by paying a fee. If a user needs a legal license to "play" the system, then the system could use the working internet connection and offer a credit card payment option for what they need and store it per account. Thin clients are used in businesses; they don't need to be able to play high-def games or read and store on really fast SSDs. The thing I'm working on is however suited for both desktop computers and thin clients, using local storage only as a backup drive. It would bring all the available Operating Systems (all the different workspaces/desktops) as a network boot option to every computer that supports PXE netboot, which in the end would mean a live system personalized to each end user. I think this is what Microsoft is aiming for with their OS entrance when installing, taking their Windows 8.1 Pro installer into account?
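A boot GUI like the one described could start out as a simple menu script, assuming iPXE (the open-source PXE boot firmware) on the clients; the server URLs and menu entries here are placeholders, not a working setup:

```
#!ipxe
# minimal OS-chooser menu served centrally to PXE clients
menu Choose your Operating System
item ubuntu    Ubuntu Desktop
item winpe     Windows PE
choose target && goto ${target}

:ubuntu
chain http://netboot.example.com/ubuntu/boot.ipxe

:winpe
chain http://netboot.example.com/winpe/boot.ipxe
```

Each developer-uploaded ISO would then just add one `item` line and one `chain` target to the menu.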

On fiber connections there should not be any speed issue; a fiber connection can even move data faster than a conventional hard drive can read it. The only limit is the WiFi or cable connection to the desktop (cable being the slowest but most secure solution).

Streaming OS.

Future looks bright :-)

Music: https://www.youtube.com/watch?v=wuHmIDQQjW4

Published in Traffic

I've been at it for a while now (since 1995) and I must say the internet has really changed a lot since day 1. Webmasters are on a daily quest to make money, and between truth and a good conversion ratio there is a gap, as I discovered. To start with I have to say that e.g. Facebook is doing a great job at what they do, being a social media website gathering the masses in one spot. However, this presents a security risk for the users, because they would think "We don't want to share our credit card information with a company that has leaked private information on several occasions". In my belief this could be a matter of risk management. They could, and I think they will, begin to use a system that may be called "Pure" or whatever they like. This system keeps the user information in several layers and rewrites/replaces essential user information with unreal/fake info without saying it's all a hoax; I only see the geographical position as a "keep it real". Max would be Thorvald and Maxine could be Francine, etc.

The truth does not need to suffer: replace private information with something almost still real but with a different name, email address, private address and phone number, into a so-called "Pure profile" which won't be as pure as a real one, but close. Take it as a small gift to Facebook, and let's enlighten Mark a bit. Your conversion ratio on websites is fair; otherwise marketing companies would leave you alone, not buy advertising spots, and you'd be left with almost no sales of ad spots. It would though be nice if you could tell people that you're considering a system that protects privacy, and stop leaking private information to 3rd-party companies by selling REAL information for marketing purposes.

All it takes, on every one of your sales, is to copy the requested data onto a secured platform and start replacing essential private information (this can be done automated) with something that cannot identify your users as who they are in the real world. Right now this already works beautifully on your own site because of all the fake accounts, but it would be good to see it full-scale to protect users and give the internet a better reputation, instead of being called a liar just because you're a webmaster; as people we really hate to lie about the truth. Export data into a second version of the database and still keep Facebook a really nice experience. The users would not need to know anything else than that you have secured their data and you're now selling databases tailored for marketing purposes. I think this should be fairly easy for everyone to understand. I could also say: keep statistics in mind, keep geo data real, but replace personal identifiers. It would not lose any value, because the data is still usable for marketing purposes. Otherwise make it an option and let marketing companies opt in to this second database, activating the system for advertisers. Just keep them aware of what they are about to experience, etc.
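A toy sketch of that automated replacement step, in PHP since that's what the site runs on (the field names and the fake-name pool are made up for illustration; a real system would cover every identifying field and keep the mappings consistent across exports):

```
<?php
// replace identifying fields with fake values; keep geo fields real,
// so the exported profile stays usable for marketing statistics
function pseudonymize(array $user): array
{
    $fakeNames = ['Thorvald', 'Francine', 'Henrik', 'Maxine'];

    $user['name']  = $fakeNames[array_rand($fakeNames)];
    $user['email'] = 'user' . random_int(100000, 999999) . '@example.com';
    $user['phone'] = '';

    // 'city' and 'country' are deliberately left untouched
    return $user;
}
```

Run every row through a function like this while copying into the second database, and the marketing copy never contains a real identity.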

Back to my own mini network: right now all sites are working almost as intended. It just needs a small tweak, as in getting my feeders live again and not redirecting all traffic from my type-in domains to midnight12.com, so people can get what they want: sites relating to webcams, even though the tube is pretty awesome in my personal opinion. I try to manage it in the most professional way I can think of and that I can achieve with my current knowledge :-)

I hope you had some great holidays. See you again soon.

Published in Social Media