Tuesday, November 19, 2013

finding mysql configuration file
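A quick way to find out which configuration file MySQL actually reads is to ask the client itself; a small sketch (paths vary by distribution):

mysql --help | grep -A1 "Default options"
# prints the search order, e.g. /etc/my.cnf /etc/mysql/my.cnf ~/.my.cnf

# on most Linux installs one of these exists:
ls /etc/my.cnf /etc/mysql/my.cnf ~/.my.cnf 2>/dev/null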

Sunday, October 20, 2013

Rooting Samsung Galaxy Grand i9082

1. Download and Install Odin v3.04.
2. Download SuperUser-1.25. (And put the zip in your SD Card)
3. Download CWM-touch_i9082_chotu.tar.

Steps :
1. Boot your phone in Download mode by holding Power + Vol. Down + Home Button together.
2. Start Odin on your PC and connect the phone through USB Cable.
3. Deselect Auto Reboot in Odin. Select the CWM tar file under PDA (after unzipping the file downloaded in step 3 above).
4. Click Start in Odin.
5. Once it says Pass in Odin - remove the battery to power off the phone - also remove the USB cable.
6. Put the battery back and boot up in recovery mode by holding Power + Vol. Up + Home Button together.
7. Select "choose zip from sd card" and pick your SuperUser zip.
8. Once it is done, reboot.
9. That's all.

Thursday, August 29, 2013

Error Code: 1205. Lock wait timeout exceeded; try restarting transaction

Increase the value of innodb_lock_wait_timeout in the my.cnf or my.ini file.
In XAMPP for Windows, the file is located at xampp/mysql/bin/my.ini.
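For example (120 seconds is only an illustrative value; on recent MySQL versions the variable is also dynamic, so it can be checked and raised at runtime without a restart):

# my.cnf / my.ini, under the [mysqld] section:
#   innodb_lock_wait_timeout = 120

# or at runtime, via the mysql client:
mysql -u root -p -e "SHOW VARIABLES LIKE 'innodb_lock_wait_timeout';"
mysql -u root -p -e "SET GLOBAL innodb_lock_wait_timeout = 120;"

New connections pick up the new global value; existing sessions keep their own.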

Tuesday, July 9, 2013

OpenFire message archives (needs fastpath and monitoring plugins)

Here are the relevant mysql tables : 

1. fpsessionmetadata
2. ofMucConversationLog
3. ofConversation
4. ofConParticipant
5. ofMessageArchive
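A minimal sketch for peeking at the one-to-one archive from the mysql client (openfire_user/openfire_db are placeholders, and the exact columns of ofMessageArchive can vary a bit between plugin versions, so describe the table first):

mysql -u openfire_user -p openfire_db -e "DESCRIBE ofMessageArchive;"
mysql -u openfire_user -p openfire_db -e "SELECT fromJID, toJID, sentDate, body FROM ofMessageArchive ORDER BY sentDate DESC LIMIT 20;"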

Tuesday, June 11, 2013

unable to send mail through php mail function on CentOs (Amazon EC2)

$ service sendmail status
sendmail dead but subsys locked
sm-client (pid  17067) is running...

$ netstat -lnp | grep :25
tcp        0      0 127.0.0.1:25                0.0.0.0:*                   LISTEN      1057/master

Port 25 is already held by another process (pid 1057, "master"), and a stale lock file is left under /var/lock/subsys, so the init script cannot restart sendmail cleanly. Kill the process, remove the lock, and restart sendmail:

$ kill -9 1057
$ rm /var/lock/subsys/sendmail

$ service sendmail stop
$ service sendmail start
$ service sendmail status
sendmail (pid  17288) is running...
sm-client (pid  17296) is running...
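On many stock CentOS/Amazon Linux images the "master" process that was holding port 25 belongs to Postfix. If that is the case here (an assumption; check with ps -p <pid> first), it is worth stopping and disabling it so it does not grab the port again on reboot:

# only if the port-25 listener turns out to be Postfix (verify first):
$ service postfix stop
$ chkconfig postfix off    # keep it from starting at boot and taking port 25 again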


Saturday, May 18, 2013

Using Hiphop for PHP static analysis

1. Install hiphop on Ubuntu (I just got an AWS EC2 instance) following these steps :
https://github.com/facebook/hiphop-php/wiki/Prebuilt-Packages-on-Ubuntu-12.04

2. Then run the hiphop command for static code analysis :
hhvm --hphp --target hhbc --input-list /tmp/files.list --output-file hhvm.hhbc.sq3
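The file passed to --input-list is assumed to be just a plain list of the PHP files to analyze, one path per line; something like this builds it:

find /path/to/your/php/project -name '*.php' > /tmp/files.list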

Building hiphop php on CentOS 6.3 (didn't succeed)

I tried to follow the instructions here : http://stackoverflow.com/a/8132054/49560.
There were a few things missing, which I am mentioning here.

1. The path of the third-party patches has changed in the hiphop repo, so find the correct path and then proceed. For me it was :

    cp /home/ec2-user/hiphop/hiphop-php/hphp/third_party/libevent-1.4.14.fb-changes.diff .
    cp /home/ec2-user/hiphop/hiphop-php/hphp/third_party/libcurl.fb-changes.diff .

2. The version/location of tbb has changed.
Get it with : wget http://threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb41_20130425oss_src.tgz

3. Before running cmake . :
A. yum install subversion
B. Install google glog
C. yum install elfutils-libelf-devel
D. Get libdwarf :
git clone git://libdwarf.git.sourceforge.net/gitroot/libdwarf/libdwarf
cd libdwarf/libdwarf
./configure
make
sudo cp libdwarf.a /usr/lib64/
sudo cp libdwarf.h /usr/include/
sudo cp dwarf.h /usr/include/
cd ../..

4. Now run cmake . and then make. You may get an error saying that lock_guard is not a member of boost, so you need to replace boost::lock_guard with std::lock_guard in ThreadLocalDetail.h (a one-liner for this is sketched after this list).

5. Now run make again. I got some errors which were beyond me, so I gave up at this point.
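For step 4, the replacement can be done with a one-liner instead of editing by hand (a sketch; it simply rewrites every occurrence in that header, wherever it sits in the checkout):

find . -name ThreadLocalDetail.h -exec sed -i 's/boost::lock_guard/std::lock_guard/g' {} +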

Saturday, May 4, 2013

Set up hadoop cluster on EC2

I followed : http://blog.cloudera.com/blog/2012/10/set-up-a-hadoophbase-cluster-on-ec2-in-about-an-hour/

There are a few things missing here and there, but it's a great help otherwise.

Here are the missing parts :

1. Get the EC2 command line tools : wget http://s3.amazonaws.com/ec2-downloads/ec2-api-tools.zip
Your EC2_HOME should point to the directory where this file is unzipped.

2. Here are all the lines you will put in your ~/.bash_profile :

export AWS_ACCESS_KEY_ID=BKIAISLDPUEUJN2ILNTP
export AWS_SECRET_ACCESS_KEY=n+v7BZhFy5CwUqpC27C/q8/vJiz5+vy0YH4Z8yyV
export EC2_PRIVATE_KEY=/mnt/aws/aws-pk.pem
export EC2_CERT=/mnt/aws/aws-cert.pem
export EC2_HOME=/mnt/aws/ec2-api-tools-1.6.7.3/
export JAVA_HOME=/usr

3. After you launch your Ubuntu instance, go to the AWS console and, in the corresponding security group, add a rule for SSH so that you can log in to the box.

4. On the Ubuntu box, put these 2 lines in .bashrc :
export AWS_ACCESS_KEY_ID=BKIAISLDPUEUJN2ILNTP
export AWS_SECRET_ACCESS_KEY=n+v7BZhFy5CwUqpC27C/q8/vJiz5+vy0YH4Z8yyV


5. Here is your hadoop.properties file for whirr :
whirr.cluster-name=whirrly
whirr.instance-templates=6 noop
whirr.provider=aws-ec2
whirr.identity=${env:AWS_ACCESS_KEY_ID}
whirr.credential=${env:AWS_SECRET_ACCESS_KEY}
whirr.cluster-user=huser
whirr.private-key-file=${sys:user.home}/.ssh/id_rsa
whirr.public-key-file=${sys:user.home}/.ssh/id_rsa.pub
whirr.env.repo=cdh4
whirr.hardware-id=m1.large
whirr.image-id=us-east-1/ami-1db20274
whirr.location-id=us-east-1
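With hadoop.properties saved, the cluster is brought up (and later torn down) through the Whirr CLI; this is standard Whirr usage, adjust the path if whirr is not on your PATH:

whirr launch-cluster --config hadoop.properties

# when you are done with the cluster:
whirr destroy-cluster --config hadoop.properties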


Friday, April 26, 2013

jquery autocomplete source function

For example, if the user types "john smith" in the inputBox, and you want to return all the elements which contain either "john" or "smith" or both, then use the following code :
 
$( "#inputBox" ).autocomplete({
source: function (request,response) {
var terms = request.term.trim().replace(/\s+/g,' ').split(" "); //Now terms is an array ['john','smith']
var respData = {}; 
terms.forEach( function(term) { 
lookUpArray.forEach(  function (lookUpElem) {
var regexp = new RegExp(term, "gi");
if(lookUpElem.match(regexp)) {
respData[lookUpElem] = 1;

});
});
var results = [];
for(var k in respData) results.push(k);
response(results);
}
});

lookUpArray is a global variable here. If it contains ['john dao', 'smith dao', 'john smith', 'singh bone'], then the first 3 elements will be returned for the query "john smith".

Friday, April 19, 2013

How can you not love PHP?

1. Once I wanted to set up something like Wikipedia on my own. I got hold of MediaWiki - written in PHP/MySql. Dumped it on my shared host. In less than an hour I was up and running.

2. Once I wanted to run an e-commerce site. I sifted through half a dozen e-commerce engines - all written in PHP/MySql. Dumped one on my shared host. In a couple of days I was up and running.

3. Then the mighty Google Reader closed down - once and for all. I dumped Tiny Tiny RSS Reader on my shared host. In a couple of hours I was up and running.

How can I not love PHP after all this and more?

Tiny Tiny RSS Reader : This XML Document is Invalid

I just came across this problem for this feed : http://techcircle.vccircle.com/feed/

The problem is whitespace at the beginning of the XML, so the response has to be trimmed.

Here is the solution : 

1. Open the file lib/simplepie/simplepie.inc
2. Somewhere around line #1342 you will find :

// Loop through each possible encoding, till we return something, or run out of possibilities
foreach ($encodings as $encoding)
{
// Change the encoding to UTF-8 (as we always use UTF-8 internally)
if ($utf8_data = $this->registry->call('Misc', 'change_encoding', array($this->raw_data, $encoding, 'UTF-8')))


Add the following line as the first statement in the foreach loop : 
$this->raw_data = trim($this->raw_data);

Now the code should look like : 

// Loop through each possible encoding, till we return something, or run out of possibilities
foreach ($encodings as $encoding)
{
$this->raw_data = trim($this->raw_data);
// Change the encoding to UTF-8 (as we always use UTF-8 internally)
if ($utf8_data = $this->registry->call('Misc', 'change_encoding', array($this->raw_data, $encoding, 'UTF-8')))

Wednesday, March 27, 2013

AWS Auto Scaling through UI

Only third-party tools exist.

AWS Auto Scaling - Invalid Image Id Problem

If you get an error like Invalid Image Id while creating a launch configuration, one possible reason is :

1. Your AMI exists in a different region than the one you are creating your launch configuration for.

AWS Auto Scaling CLI

1. Step by Step Guide : http://docs.aws.amazon.com/AutoScaling/latest/GettingStartedGuide/SignUp.html
2. Summary of Config : 
In your .bash_profile (on CentOs 6.3), put the correct values for the following : 

export AWS_AUTO_SCALING_HOME=/mnt/aws/AutoScaling-1.0.61.2/
export JAVA_HOME=/usr/
export PATH=$PATH:$AWS_AUTO_SCALING_HOME/bin
export AWS_CREDENTIAL_FILE=/mnt/aws/myCredentialFile
export AWS_AUTO_SCALING_URL=https://autoscaling.ap-southeast-1.amazonaws.com

3. Format of the credentials file (/mnt/aws/myCredentialFile) :

AWSAccessKeyId=AKIAITFNHGOMJKLNJABC
AWSSecretKey=5aFNFbmXPNr/YYhEuFHBMX8czR+AjpdvrRE5nY6r

3a. Get your credentials from : https://portal.aws.amazon.com/gp/aws/securityCredentials#access_credentials

4. List of other regions : 
http://docs.aws.amazon.com/general/latest/gr/rande.html#as_region
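Once the variables above are loaded into the shell, a quick sanity check is to list the existing groups with one of the commands shipped in the Auto Scaling CLI bundle (a sketch; it prints your Auto Scaling groups, if any):

source ~/.bash_profile
as-describe-auto-scaling-groups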


Monday, March 25, 2013

java

instantiating an object of an interface on the fly through anonymous classes : 
  IBlah blah = new IBlah() {
      public void doBlah() {
          System.out.println("Doing Blah");
      }
  };

Tuesday, March 19, 2013

Installing PHP Imagick extension on CentOs 6.3

yum install ImageMagick-devel
pecl install imagick
add extension=imagick.so to /etc/php.ini (and restart the web server so the extension is picked up)
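To confirm the extension is actually loaded afterwards (a quick check, not part of the original steps):

php -m | grep -i imagick    # should print "imagick"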

Monday, March 18, 2013

Tiny Tiny RSS Reader - Unable to update through command line

I tried multiple things but I was unable to run update.php from the command line.

Here is the solution.

I just copied the code required for feed updates into a standalone script (update_feeds.php, below) and ran that from the command line.
Here is the cron job command (if you are on Hostmonster) :
wget complete_path_to update_feeds.php
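A concrete crontab entry would look something like this (the URL is only a placeholder for the real path to update_feeds.php):

# run the feed update every 30 minutes, discarding the fetched output
*/30 * * * * wget -q -O /dev/null http://example.com/read/update_feeds.php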



update_feeds.php
#!/usr/bin/php53s
<?php
set_include_path(dirname(__FILE__) ."/include" . PATH_SEPARATOR .
                get_include_path());

define('DISABLE_SESSIONS', true);

chdir(dirname(__FILE__));

require_once "functions.php";
require_once "rssfuncs.php";
require_once "config.php";
require_once "sanity_check.php";
require_once "db.php";
require_once "db-prefs.php";

if (!defined('PHP_EXECUTABLE'))
    define('PHP_EXECUTABLE', '/usr/bin/php53s');

// Create a database connection.
$link = db_connect(DB_HOST, DB_USER, DB_PASS, DB_NAME);

init_connection($link);
// Update all feeds needing an update.
update_daemon_common($link);

// Update feedbrowser
$count = update_feedbrowser_cache($link);
_debug("Feedbrowser updated, $count feeds processed.");

// Purge orphans and cleanup tags
purge_orphans($link, true);

$rc = cleanup_tags($link, 14, 50000);
_debug("Cleaned $rc cached tags.");

db_close($link);

// $lock_filename is never set in this stripped-down script (the lock file is
// created by tt-rss's own update.php), so guard the cleanup to keep it a no-op here.
if (!empty($lock_filename) && file_exists(LOCK_DIRECTORY . "/$lock_filename"))
    unlink(LOCK_DIRECTORY . "/$lock_filename");
?>

Sunday, March 17, 2013

Self Host Tiny Tiny RSS Reader

Steps : (Requires PHP 5.3 or above and MySql/PgSql)
If you are using HOSTMONSTER - here is how you can change your PHP version.

1. Get the tarball from http://tt-rss.org/redmine/projects/tt-rss/wiki
2. tar -xvzf <file> in your public_html/read
3. copy config.php.dist to config.php
4. Create a DB and update the info in config.php
5. source schema/ttrss_schema_mysql.sql into that DB (see the example command after these steps).
6. In config.php, set SIMPLE_UPDATE_MODE to true and SINGLE_USER_MODE to false.
6a. In the same file, change SESSION_COOKIE_LIFETIME to 86400.
7. Now log in with admin/password
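The schema import in step 5 from the shell (db_user and db_name are placeholders for the database created in step 4):

mysql -u db_user -p db_name < schema/ttrss_schema_mysql.sql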

For those who are used to the keyboard shortcuts j/k for prev/next article on Google Reader :
1. Enable the googlereaderkeys plugin under Preferences -> Plugins -> User Plugins.


Disabling up/down arrow keys shortcuts :

1. Open include/functions.php
2. Go to the function get_hotkeys_map
3. Comment out these lines :

     "(38)|up" => "prev_article",
                                "(40)|down" => "next_article",




Monday, January 7, 2013

Dropbox not syncing due to bad file name

Recently my Dropbox client on Windows stopped syncing new files all of a sudden, whereas the same files were syncing fine on CentOS (Linux).
I tried : 
1. Uninstalling + Reinstalling the Windows client.
2. Uninstalling + Rebooting Windows + Reinstalling the Windows client.
3. Changing the location of the Dropbox folder on my machine.

But none of it worked, because the real problem was the presence of a colon (:) in one of my filenames.

There are some characters which, if present in a filename, prevent syncing on Windows clients.


Here is the list of your bad files (you need to be logged into Dropbox to see this list).


So I renamed my old files like this :

# replace every colon in the file name with an underscore
for myfile in ~/Dropbox/*; do
    target=$(echo "$myfile" | sed -e 's/:/_/g')
    [ "$myfile" != "$target" ] && mv "$myfile" "$target"
done


Source.
