
Tuesday, June 14, 2016

Why join the Toptal network? Not because it's easy, but because it's challenging!

I have been in the world of software for 8.5 years since I graduated from college. It has been a roller coaster ride since joining in 2007.
I started with Mainframe, the technology chosen by my company, and I worked on it throughout, learning one component after another. After six years on the same technology, I found there was not enough scope for opportunity and innovation, as the technology is very old and no longer being updated.
I had my own hunger for learning new technologies, so I started with Java, then Python, and finally Hadoop, which felt like the pinnacle. Hadoop, or rather the tool-based platform around it, combines beauty with speed.
To improve my knowledge after completing my external training on Hadoop, I started looking for freelancing projects. While I was trying my hand at small projects here and there, I was still hungry and searching for more when I came across the Toptal software freelancers network, which I found interesting as well as challenging, and challenging work appeals to me more than anything. The initial information I found on the website was quite interesting, so I decided to join the network for challenging work and the learning opportunity.

In a nutshell, I'm here for a) learning, b) challenging opportunities, and c) obviously the money, if it comes as a bonus.

Wednesday, May 18, 2016

SQOOP saved job and incremental import

I executed the following commands to perform an incremental append into my Hive table "customers" from the MySQL table "CUSTOMERS".

--- Sqoop command to create a saved job for the required import ---
sqoop job --create myIncreImportjob -- import \
  --connect jdbc:mysql://localhost/training --username training --password training \
  --table CUSTOMERS --hive-import --hive-table customers \
  --incremental append --check-column ID --last-value 4 \
  --fields-terminated-by ','

--- Command for executing the saved job ---
sqoop job --exec myIncreImportjob
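
To confirm that the job definition was actually saved, and to see the parameters Sqoop stored for it (including the incremental settings), the built-in saved-job commands can be used; the job name below is simply the one created above.

--- List all saved jobs in the metastore ---
sqoop job --list

--- Show the stored definition of the saved job, including its incremental import settings ---
sqoop job --show myIncreImportjob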


Observations:
1. If you don't mention --hive-import, the data is imported into an HDFS directory named after the MySQL table.
2. If you don't mention --last-value, the entire table is imported.
3. Even though the saved job has the hard-coded value 4 as the last value, on the next run that value is overridden by the last value (from the ID column) recorded after the Sqoop execution, as can be checked with the commands below.
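
A quick way to see point 3 in action is to inspect the saved job before and after a run; the stored last value should advance to the highest ID imported. This is only a sketch: the grep pattern assumes the property is reported as incremental.last.value (the exact name may vary by Sqoop version), and the Hive query assumes the check column shows up as id in the Hive table.

--- Check the last value currently stored with the saved job ---
sqoop job --show myIncreImportjob | grep -i "last.value"

--- Cross-check against the data that actually landed in Hive ---
hive -e "SELECT MAX(id) FROM customers"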