This blog shares what I learn and troubleshoot in my day-to-day work on Linux servers.
Wednesday, April 22, 2009
Now it's easy to deploy
Let Capistrano do the heavy lifting for you. It is designed with repeatability in mind, letting you easily and reliably automate tasks that used to require login after login and a small army of custom shell scripts.
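Getting started takes only a few commands; a minimal sketch, assuming a Capistrano 2-era Ruby project (your actual recipes go in config/deploy.rb):
gem install capistrano
capify .        # generates Capfile and config/deploy.rb
cap deploy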
Monday, April 13, 2009
Copy a large number of small files over the network - the fastest method
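scp and ftp pay a per-file overhead that adds up badly over thousands of tiny files; streaming everything as a single tar archive over ssh avoids it. A minimal sketch, assuming a source directory ./data and a reachable host remotehost (both hypothetical names):
tar czf - data | ssh remotehost 'cd /destination && tar xzf -'
If you may need to re-run the copy, rsync -az data remotehost:/destination/ is a good alternative, since it only transfers what has changed.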
Wednesday, April 8, 2009
Getting the login and password from Apache authentication using PHP
// Works when PHP runs as an Apache module and the client sent Basic authentication
$headers = apache_request_headers();
$auth = $headers['Authorization'];        // e.g. "Basic dXNlcjpwYXNz"
$autha = explode(' ', $auth);
$pair = base64_decode($autha[1]);         // decodes to "login:password"
$authb = explode(':', $pair, 2);          // limit 2: the password may contain ':'
echo $login = $authb[0];
echo "\n";                                // double quotes, so \n prints a real newline
echo $pass = $authb[1];
Include this inside PHP tags. :)
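When PHP runs as an Apache module and Apache itself handled the Basic authentication, the same values are also available directly as $_SERVER['PHP_AUTH_USER'] and $_SERVER['PHP_AUTH_PW'], which saves the manual decoding.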
Tuesday, April 7, 2009
Optical fiber connector (OFC) standards
http://www.thefoa.org/tech/connID.htm
http://en.wikipedia.org/wiki/Optical_fiber_connector
Friday, April 3, 2009
Redirect Using Iptables
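This rewrites incoming TCP connections arriving on eth0 for port 80 so they land on local port 8080 instead, which is handy for running a webserver on an unprivileged port as a non-root user: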
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 8080
Thursday, April 2, 2009
HowTo get a small sample dataset from a MySQL database using mysqldump
Here is a quick tip showing how you can get a small sample dataset from a MySQL database using mysqldump. We frequently need a small snapshot of a very big production database to import into a development or staging database that does not need all the original data. Say we need 1,000,000 records from every table in the database: we just use the option --where="true LIMIT X", where X is the number of records after which mysqldump should stop.
We will simply run something like this (add whatever other options you need to mysqldump):
mysqldump --opt --where="true LIMIT 1000000" mydb > mydb1M.sql
and this will get 1M records from each of the tables in the database. If you want this for a single table, you would use something like:
mysqldump --opt --where="true LIMIT 1000000" mydb mytable > mydb_mytable_1M.sql
To restore this, you would do the same as with a regular dump:
mysql -p mydb_stage < mydb1M.sql
This gives you a small set of records that you can use for development, testing, or whatever else you need.
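One caveat: the LIMIT is applied to each table independently, so rows sampled from related tables will not necessarily reference each other, and foreign-key relationships may not hold in the resulting dataset.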
Find all the files of a particular size
Let's assume you are searching for all files of exactly 6579 bytes inside the home directory. You just have to run something like:
find /home/ -type f -size 6579c -exec ls {} \;
As units you can use:
* b - for 512-byte blocks (this is the default if no suffix is used)
* c - for bytes
* w - for two-byte words
* k - for Kilobytes (units of 1024 bytes)
* M - for Megabytes (units of 1048576 bytes)
* G - for Gigabytes (units of 1073741824 bytes)
You can search for an exact file size, or just for bigger (+) or smaller (-) files. For example, all files bigger than 512k would be found with something like:
find /home/ -type f -size +512k -exec ls -lh {} \;
I added -lh to the ls command here so it actually shows the files with their sizes in a nice human-readable format. Similarly, for smaller files you would use -size -512k.
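You can also combine the two forms to match a size range, for example all files between 1MB and 10MB:
find /home/ -type f -size +1M -size -10M -exec ls -lh {} \;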
Change a password in a single command
This sets the password "redhat" for the user "test".
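On Red Hat-style systems, passwd can read the new password from stdin (run as root):
echo "redhat" | passwd --stdin test
On systems whose passwd lacks --stdin, chpasswd does the same job:
echo "test:redhat" | chpasswd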
Wednesday, April 1, 2009
Type casting a column in Postgres
If you try to change a column's type directly, like this:
alter table tablename alter column column_name type integer;
you will get this error:
ERROR: column "column_name" cannot be cast to type "pg_catalog.int4"
So you need to tell Postgres how to convert the existing values, with a USING clause:
alter table tablename alter column column_name type integer using (column_name::integer);