Cheap VPS Hosting Promos
VPS Coupon Codes, Reviews, Tutorials

Knowledgebase


How to Get 100% GTmetrix Scores and 100% PageSpeed Insights

Many webmasters test their websites' performance with GTmetrix and Google PageSpeed Insights and want higher scores, especially for WordPress sites, since WordPress loads very slowly without any optimization. Let's check the speed snapshots of my site first: you can see that I got 100% scores on both tools, and yes, I'm very satisfied with this. In this article I will teach you how to optimize and speed up your WordPress site to get high scores just like mine. My English sucks, so let's keep it simple and go straight to the point. Hosting:…
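To make one step concrete (this is my illustrative sketch, not necessarily a technique the full article uses): a common PageSpeed Insights fix for WordPress is deferring render-blocking JavaScript through the script_loader_tag filter in your theme's functions.php. The function name and the list of skipped handles are assumptions you would adapt to your own theme:

// Illustrative sketch only -- goes in your theme's functions.php (assumed location).
// Defer all enqueued scripts except jQuery so they stop blocking rendering.
add_filter('script_loader_tag', 'waikey_defer_scripts', 10, 3);
function waikey_defer_scripts($tag, $handle, $src) {
    $skip = array('jquery', 'jquery-core'); // assumed handles to leave alone
    if (in_array($handle, $skip, true)) {
        return $tag;
    }
    // defer lets the script download in parallel and run only after
    // the HTML is parsed, which addresses the "eliminate
    // render-blocking resources" audit.
    return str_replace(' src=', ' defer src=', $tag);
}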

Views(2399)

WordPress: How to hide/encrypt outbound/affiliate links without a plugin

DEMO: Check any post on my blog and click the links in it; you will find that all outbound links are encrypted with base64 and redirected through a door page: https://www.waikey.com/go/ Let's go straight to the point: in this article I will teach you how to encrypt and hide all outbound links (affiliate links) without plugins. Add this code at the bottom of your theme's functions.php:

1) For the links in the post:

add_filter('the_content', 'link_jump', 999);
function link_jump($content) {
    // Collect every <a href="..."> in the post content.
    preg_match_all('/<a(.*?)href="(.*?)"(.*?)>/', $content, $matches);
    if ($matches) {
        foreach ($matches[2] as $val) {
            // Only rewrite absolute URLs that point away from this site
            // and are not images or download-protocol links.
            if (strpos($val, '://') !== false
                && strpos($val, home_url()) === false
                && !preg_match('/\.(jpg|jpeg|png|ico|bmp|gif|tiff)/i', $val)
                && !preg_match('/(ed2k|thunder|flashget|qqdl):\/\//i', $val)) {
                $content = str_replace(
                    "href=\"$val\"",
                    "href=\"" . home_url() . "/go/" . base64_encode($val) . "\" rel=\"nofollow\" target=\"_blank\"",
                    $content
                );
            }
        }
    }
    return $content;
}

2) For the…
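The excerpt cuts off before part 2, so I won't guess the author's exact continuation; still, for the demo URL above to work, something at /go/ has to decode the base64 and redirect. Here is a minimal sketch of such a door page, under the assumption that a rewrite rule routes /go/* to a file like go/index.php (the file name and the parsing logic are hypothetical, not the article's code):

<?php
// Hypothetical door page (e.g. go/index.php) -- an assumed implementation.
// A URL like https://www.waikey.com/go/aHR0cHM6Ly8uLi4= carries the
// base64-encoded target as the path segment after /go/.
$path    = $_SERVER['REQUEST_URI'];                  // e.g. /go/aHR0cHM6Ly8uLi4=
$encoded = trim(substr($path, strlen('/go/')), '/'); // raw base64 may contain / + =
$target  = base64_decode($encoded, true);            // strict mode: reject bad input

// Only redirect to plausible http(s) URLs; otherwise fall back to the homepage.
if ($target === false || !preg_match('#^https?://#i', $target)) {
    header('Location: /');
    exit;
}
header('X-Robots-Tag: noindex, nofollow'); // keep the door page out of search results
header('Location: ' . $target, true, 302);
exit;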

Views(2396)

How to Optimize WordPress Robots.txt for SEO

What is Robots.txt in WordPress? Robots.txt is a text file that allows a website to provide instructions to web-crawling bots. Search engines like Google use these web crawlers, sometimes called web robots, to archive and categorize websites. Most bots are configured to look for a robots.txt file on the server before reading any other file from the website. They do this to see whether the website's owner has special instructions on how to crawl and index the site. Why Should You Use robots.txt? Having a robots.txt file isn't crucial for a lot of websites, especially small…
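As a concrete illustration (the sitemap URL is a placeholder, not from the article), a typical WordPress robots.txt blocks the admin area while keeping admin-ajax.php crawlable, since some front-end features depend on it:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml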

Views(3347)

How to Solve [-bash-4.2#] Error in SSH Terminal Prompt

The problem: Usually the SSH terminal prompt starts with [name@host ~]$, but when I logged in over SSH I got bash-4.2$ instead. How to solve: Every account uses bash, and bash is configured to show your username, hostname, and path in the shell prompt; shells are very configurable. So if you see bash-4.2$ instead of your username, two files are probably missing from /root: .bash_profile and .bashrc. Just copy these two files back to /root:

cp /etc/skel/.bashrc /root/
cp /etc/skel/.bash_profile /root/

and the problem should be solved!…
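For context (my note, not from the article): the prompt text itself comes from the PS1 variable that those skel files configure, so you can also restore the familiar prompt by hand for the current session:

# \u = username, \h = hostname, \W = basename of the working directory,
# \$ = "$" for normal users and "#" for root
export PS1='[\u@\h \W]\$ '

If the prompt comes back, the missing dotfiles were indeed the cause; the cp commands above make the fix permanent.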

Views(3709)