Optimize Your CPVLab Pt.2

Hi Guys,

Yes, it’s been a while. I’ve been moving around a lot, and frankly haven’t been all too inspired to post anything lately. I have a lot I can share with the AM community, but since it’s full of blood-thirsty, money-hungry dishonesty, I gotta be careful and keep some goodies to myself.

Enough of the B/S – here are some tips to keep your CPVLab DB running a little smoother.

The first thing you’ll need is SSH access to your server. Don’t know what that is? Then this is probably too advanced for you. However, if you’re on a managed server, I’m sure your hosting company can have a techie help you out with this one.

After digging around I found some recommendations, and these settings seem to have improved performance for me. You’ll want to pay attention and ensure that these settings are ideal for YOUR server, as every server is different, especially when it comes to RAM allocation. My *new* server that I’m currently moving to has 5GB of RAM allocated, so my innodb_buffer_pool_size is set to 2GB (or 2048M).
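If you’re unsure where you stand, a quick way to check (assuming you can log into MySQL as root) is to compare your total RAM against the current buffer pool setting:

free -m                 # total and used memory, in MB
mysql -u root -p -e "SHOW VARIABLES LIKE 'innodb_buffer_pool_size';"

A common rule of thumb is 50-70% of RAM for the buffer pool on a dedicated database server, and considerably less if the same box also runs your web stack (my 2GB-of-5GB above works out to roughly 40%).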

Tweaking InnoDB Settings

These are just the settings I’ve discovered so far to help speed things up. Feel free to add your findings in the comments.

Firstly, you’ll need to edit the following file on your server: “/etc/my.cnf”

Once logged in via SSH:

  • Open the file in your editor of choice, e.g. type "nano /etc/my.cnf"
  • Add or change the following settings (just be aware, these are settings that I USED for my server. Speak with your hosting company if unsure which settings to use, or do some of your own research. Typically, the one you should be most concerned with is innodb_buffer_pool_size):
[mysqld]
thread_cache_size=16K           # threads kept alive for reuse (a count; 16K = 16384, the maximum)
table_cache=128M                # open-table cache (renamed table_open_cache in newer MySQL)
local-infile=0                  # disable LOAD DATA LOCAL INFILE (security hardening)
innodb_file_per_table=1         # one .ibd file per table, so rebuilds can reclaim disk space
max_connections=100
key_buffer_size=16M             # MyISAM index buffer
join_buffer_size=1M
query_cache_type=1              # enable the query cache (removed entirely in MySQL 8.0)
query_cache_limit=4M            # largest single result that may be cached
query_cache_size=128M
max_tmp_tables=1                # deprecated/unused in modern MySQL
default-storage-engine=MyISAM
innodb_buffer_pool_size=2048M   # the big one - size this to YOUR available RAM
innodb_flush_method=O_DIRECT    # bypass the OS cache, avoids double-buffering
low_priority_updates=1          # let reads jump ahead of writes (table-locking engines)
concurrent_insert=ALWAYS        # allow MyISAM inserts to run alongside reads
tmp_table_size=128M             # in-memory temp table limit...
max_heap_table_size=128M        # ...both of these must be raised together
  • Restart MySQL: /etc/rc.d/init.d/mysql restart
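Once MySQL is back up, it’s worth confirming the new values actually took (a quick sanity check, assuming you can log in as root):

mysql -u root -p -e "SHOW VARIABLES WHERE Variable_name IN ('innodb_buffer_pool_size','query_cache_size','innodb_flush_method');"

If a value didn’t change, MySQL may be reading a different config file; mysqld --help --verbose prints the paths it checks, in order.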


Automate Defragmentation via Cron & Bash

This next step automates defragmenting your tables, which can also help speed things up.

Firstly, you’ll need to create a new “bash” script on your server to be run via Cron later.
Let’s name this file “dbdefrag.sh”

Type "nano dbdefrag.sh" and paste the following code, modifying the user/pass for your DB accordingly.

#!/bin/bash

# MySQL login details - swap in your own USERNAME and PASSWORD
MYSQL_LOGIN='-u USERNAME --password=PASSWORD'

# Loop over every database, skipping the header row and the system schemas
for db in $(echo "SHOW DATABASES;" | mysql $MYSQL_LOGIN | grep -v -e "Database" -e "information_schema" -e "performance_schema" -e "mysql")
do
        TABLES=$(echo "USE $db; SHOW TABLES;" | mysql $MYSQL_LOGIN | grep -v Tables_in_)
        echo "Switching to database $db"
        for table in $TABLES
        do
                echo -n " * Optimizing table $table ... "
                # Rebuilding the table defragments it (and converts it to InnoDB)
                echo "USE $db; ALTER TABLE $table ENGINE=INNODB" | mysql $MYSQL_LOGIN
                echo "done."
        done
done
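Before handing this off to cron, I’d give the script one manual run over SSH to confirm the login details work and watch it chew through the tables. Worth knowing: the ALTER TABLE rebuild can lock each table while it runs, so a quiet traffic period is the safest time.

bash dbdefrag.sh

You should see one line per table, e.g. " * Optimizing table clicks ... done."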


The next step is scheduling with cron.

Steps:

  • crontab -e
  • Go to the end of the file and add the following line (note it runs at 4am every day)
  • 0 4 * * * bash dbdefrag.sh
  • CTRL+X to exit, then hit Y to save (this assumes nano; if crontab -e opens vi instead, use ESC then :wq)
  • Alternatively, you can set up a cron job via cPanel
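One cron gotcha: the job won’t necessarily run from the directory you saved the script in, so it’s safer to use the full path (the /root location below is just an example; point it at wherever you put dbdefrag.sh):

0 4 * * * bash /root/dbdefrag.sh >> /root/dbdefrag.log 2>&1

The >> redirect is optional, but it leaves a log you can tail if the job ever goes quiet, and crontab -l will confirm the entry saved.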


Disclaimer

Please consult your server admin before making any of these changes to yours. I take no responsibility if you mess your server up. Perhaps do these changes on an experimental install or server first before making the changes to your live one.


If you’re looking for further Affiliate Marketing guidance, check out StackThatMoney. It’s the best community of experienced marketers from around the globe, with exclusive meetups, follow-alongs, tutorials and the knowledge of a thousand sun-gods.

-=-=-=-

Can’t Decide on Tracking Software?

I’ve recently switched over to a new tracking platform called Thrive by the guys over at iPyxel, which I love. It’s still in development, but is constantly improving and making strides, and the best part is that it can be self-hosted. They offer a 30-day trial and it’s $99 a month thereafter, which is well worth the investment.

Those on a smaller budget can still opt to go the CPVLab route, another favorite of mine, though a little more dated. It is, however, more suitable for PPV traffic if that’s your traffic of choice.



Optimize Your CPVLab Database!

I’d like to apologize up-front for the sheer lack of attention to this blog as of late. I’ve been lazy, but mostly busy, and in transit quite a lot this year. I’ve finally returned to my home soil to get a bunch of work done, relax, and plan my return to the Canadian Rockies for some more adventure (and other traveling, of course!). In the year that has just passed I’ve spent most of my time living in Calgary, Alberta, Canada and have had the pleasure of visiting Bangkok, Phuket, Tokyo, Osaka, Vancouver, Montreal, Las Vegas, The Grand Canyon, Miami, Anaheim, Hollywood, San Diego, San Francisco, Memphis, Nashville, New Orleans, Colorado, New York & The Bahamas, and doing a bunch of crazy-amazing stuff during that time (including ASW and ASE!).

What I’m trying to get at is.. well.. I’ve been BUSY!

Enough of my personal life and onto the meaty stuff, eh?!

Recently I’ve had a lot of performance problems with my CPVLab installation. I’ve had my server upgraded, both hardware and software, but just wasn’t happy with the results. On top of that, I knew that my database was getting WAY too bloated for its own good, but I was afraid to cull my stats without a backup!

Unfortunately, that led to problems logging in today, so I decided to fix the problem myself, and I’d like to share how I did it so that others can enjoy a more reliable and speedy CPVLab! Anyone who runs a decent amount of PPV traffic (or any traffic, really) will know that the data builds up relatively fast and can bloat your DB into performance-hell.

Here are some simple steps to fix it!

Step 1 – Backup Your Data

  • Log in to your phpMyAdmin console (usually via cPanel on your server)
  • Find your CPVLab database (usually named _cpvlab) in the left-hand panel. Click on this.
  • Click on Operations
  • Under the panel labeled “Copy Database to:”, you’ll see a text field. Enter an appropriate name here. I chose _cpvlab-bkp-dd-mm-yy
  • Hit Go
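If you’re comfortable over SSH, the same backup can be taken from the command line with mysqldump (swap in your own login, and your real database name if it isn’t the usual _cpvlab):

mysqldump -u USERNAME -p _cpvlab > cpvlab-bkp-$(date +%d-%m-%y).sql

Either way, don’t skip the backup.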

Step 2 – Clean Your Clicks Data

  • Log into your CPVLab and navigate to Settings >> Stats Management
  • From here, either select the campaign you wish to cull clicks from, or select “All Campaigns”
  • Select the date range you wish to cull (I cull anything over 3 months old, or over 1 month for campaigns with lots of data).
  • Click Save
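For the curious: under the hood, culling stats boils down to a date-ranged delete against the clicks data. I haven’t dug into CPVLab’s exact schema, so the column name below is purely a made-up illustration; stick to the Stats Management page on a live install:

# hypothetical sketch only - "ClickDate" is an invented column name
mysql -u USERNAME -p -e "USE _cpvlab; DELETE FROM clicks WHERE ClickDate < DATE_SUB(NOW(), INTERVAL 3 MONTH);"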

Step 3 – Advanced Optimization

  • Within phpMyAdmin, go to your _cpvlab database once again as mentioned in Step 1.
  • In the right-hand panel, you’ll see a list of tables (affiliatesources, alerts etc..). Find the table named clicks and, well.. click on it.
  • Go to Operations
  • At the bottom left you’ll see “Table Management”
  • Click Defragment
  • Don’t click anything else. I tried to use the Optimize function and it messed with my DB.
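I can’t say exactly what phpMyAdmin runs behind that Defragment button, but a single-table ALTER TABLE rebuild gets you an equivalent result from SSH if you prefer (your own login and the usual _cpvlab name assumed):

mysql -u USERNAME -p -e "USE _cpvlab; ALTER TABLE clicks ENGINE=INNODB;"

Rebuilding copies the table fresh, which is what squeezes the fragmentation out.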


This should help speed up those cluttered DBs and hopefully improve your ROI!


– Andrew

