It was close to a year ago that I moved this blog, its predecessor, and some of my old vintage 1990s home pages from servers I maintained when I was at Maricopa. Before I left, the old “Jade” server was still running, and I set up some htaccess redirects to send requests to their proper new places. Nice and clean.

Well, the problem is that the IT folks back at Maricopa apparently yanked the machine (no one was likely even there to bug them about it), and my own blog here had lots of links and image references pointing at a 404 server.

Fixing this was not all that complex, but it took a few steps. The first was covering my _____ by going into phpMyAdmin, opening my WordPress database, and making a backup copy of the wp_posts table as wp_posts_backup (via the Operations tab). I also downloaded a SQL version of the contents, but the copied table is an easy way to revert if I mess up; I’d just have to rename it back.
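That safety-net copy can also be made with plain SQL instead of the Operations tab. A minimal sketch, using SQLite in Python as a stand-in for the MySQL server (table names come from the post; the sample rows are made up for illustration):

```python
import sqlite3

# In-memory database standing in for the WordPress MySQL database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wp_posts (ID INTEGER PRIMARY KEY, post_content TEXT)")
conn.executemany(
    "INSERT INTO wp_posts (ID, post_content) VALUES (?, ?)",
    [(1, "hello"), (2, "world")],
)

# Make the backup copy. On MySQL the equivalent is:
#   CREATE TABLE wp_posts_backup LIKE wp_posts;
#   INSERT INTO wp_posts_backup SELECT * FROM wp_posts;
conn.execute("CREATE TABLE wp_posts_backup AS SELECT * FROM wp_posts")

# Reverting would just be dropping the broken table and renaming the copy.
rows = conn.execute("SELECT COUNT(*) FROM wp_posts_backup").fetchone()[0]
print(rows)  # 2
```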

Next, I got the SQL contents via the Export tab, using the options for Save As File and, in addition, checking the Drop Table option, which means that when the file is re-imported, it will wipe out the old table and create a brand new copy. This file is just a long, long series of MySQL statements that recreate the database table and, sequentially, re-insert each row of data.
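The Drop Table option simply means the export leads with a DROP statement for each table, so importing the file more than once is safe: the table is wiped and rebuilt rather than doubled. A small sketch of that idempotent pattern (SQLite syntax here, standing in for the MySQL dump, with made-up sample rows):

```python
import sqlite3

# A miniature version of what the exported dump file contains.
dump = """
DROP TABLE IF EXISTS wp_posts;
CREATE TABLE wp_posts (ID INTEGER PRIMARY KEY, post_content TEXT);
INSERT INTO wp_posts VALUES (1, 'first post');
INSERT INTO wp_posts VALUES (2, 'second post');
"""

conn = sqlite3.connect(":memory:")
conn.executescript(dump)  # first import creates the table
conn.executescript(dump)  # re-import wipes it out and rebuilds it, no duplicates
count = conn.execute("SELECT COUNT(*) FROM wp_posts").fetchone()[0]
print(count)  # 2, not 4
```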

After this was done, I had a 4.2 Mb text file on my computer. I use the best search and replace text editor around (for me, it is BBEdit), and I ran a few replacements on the whole file… replace with (this blog); with (I had a bunch of image references in that old subdirectory); and lastly all with (the newer home of Feed2JS). Also, on a Mac, it is important to make sure the text files have Unix line return characters, not Mac ones, so that on re-upload the file is interpreted correctly by the server… a job for the indispensable LineBreak, an OS X freeware tool that can batch convert text files between Mac, Unix, and DOS line return formats.
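The BBEdit and LineBreak steps boil down to a handful of string replacements plus a line-ending conversion. A rough equivalent in Python; the URLs here are placeholders, since the actual old and new addresses are not spelled out above:

```python
# Hypothetical old/new URL pairs; the real ones would be the old Jade
# server addresses and their new homes.
replacements = [
    ("http://old.example.edu/blog", "http://new.example.org"),
    ("http://old.example.edu/images", "http://new.example.org/images"),
]

# A one-line stand-in for the 4.2 Mb dump, with a Mac (\r) line ending.
dump = "INSERT INTO wp_posts VALUES (1, 'see http://old.example.edu/blog/post');\r"

for old, new in replacements:
    dump = dump.replace(old, new)

# LineBreak's job: convert Mac (\r) and DOS (\r\n) line endings to Unix (\n).
dump = dump.replace("\r\n", "\n").replace("\r", "\n")

print(dump)
```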

So now I had an updated SQL file, but faced this situation: at 4.2 Mb, it exceeded the 2 Mb upload limit in phpMyAdmin (actually a limit in the PHP settings on the server). So I had to chop the file into 3 pieces and run each one sequentially through the SQL tab, where you can select a text file to run a series of commands, essentially uploading files containing the instructions to first drop the old table, create a new one, and then reinsert all the data.
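Chopping the dump has to happen at statement boundaries, not in the middle of an INSERT, or the import will choke. A sketch of that split in Python (the 2 Mb limit is scaled way down here, and the dump is a tiny made-up one):

```python
def split_sql(sql: str, max_bytes: int) -> list[str]:
    """Split a dump into chunks under max_bytes, breaking only after ';\n'."""
    chunks, current = [], ""
    for stmt in sql.split(";\n"):
        if not stmt.strip():
            continue
        piece = stmt + ";\n"
        # Start a new chunk if adding this statement would blow the limit.
        if current and len(current) + len(piece) > max_bytes:
            chunks.append(current)
            current = ""
        current += piece
    if current:
        chunks.append(current)
    return chunks

# A miniature dump: one DROP plus a run of INSERT statements.
dump = "DROP TABLE IF EXISTS wp_posts;\n" + "".join(
    f"INSERT INTO wp_posts VALUES ({i}, 'post {i}');\n" for i in range(100)
)
chunks = split_sql(dump, max_bytes=1000)
# Each chunk stays under the limit and ends on a complete statement.
```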

So with that, I cleaned up about 1000 bad URLs. Not bad.

The post "Blog URL Cleaning" was originally dropped like a smoking hot potato at CogDogBlog on November 18, 2006.
