Lazy Evaluation in Apache Spark

Lazy evaluation in Spark means that execution does not start until an action is triggered. Transformations are only recorded; Spark then combines them into a single execution plan, which reduces the number of passes over the data... Read more »
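To make the idea concrete, here is a minimal sketch of Spark-style lazy evaluation in plain Python. The class and method names are hypothetical (this is not Spark's API); it only illustrates how transformations can be recorded and then executed in a single pass when an action is called:

```python
# Minimal sketch of lazy evaluation, Spark-style, in plain Python.
# Illustrative only; LazyDataset is a hypothetical class, not Spark's API.

class LazyDataset:
    def __init__(self, data):
        self._data = data
        self._transformations = []   # recorded, not executed

    def map(self, fn):
        # Transformation: just record the step and return a new dataset.
        new = LazyDataset(self._data)
        new._transformations = self._transformations + [("map", fn)]
        return new

    def filter(self, pred):
        # Transformation: also just recorded.
        new = LazyDataset(self._data)
        new._transformations = self._transformations + [("filter", pred)]
        return new

    def collect(self):
        # Action: triggers execution. All recorded steps run in ONE pass
        # over the data instead of one pass per transformation.
        out = []
        for item in self._data:
            keep = True
            for kind, fn in self._transformations:
                if kind == "map":
                    item = fn(item)
                elif kind == "filter" and not fn(item):
                    keep = False
                    break
            if keep:
                out.append(item)
        return out

ds = LazyDataset([1, 2, 3, 4]).map(lambda x: x * 10).filter(lambda x: x > 15)
# Nothing has executed yet; only the action below runs the pipeline.
print(ds.collect())  # [20, 30, 40]
```

In real Spark the same pattern holds: `map` and `filter` build up a lineage graph, and only an action such as `collect()` or `count()` launches the job.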

How to create an RDD in Spark – Interview question

Read more »

Important basic Hive commands (in Hindi)

Read more »

Sqoop import – Part 1 – Hadoop series

Read more »

Default number of mappers and reducers in a Sqoop job

Updated: Dec 12, 2018 #hadoop #sqoop #defaultmapper #defaultreducer #hadoopinterviewquestion In this post we focus on the default number of mappers and reducers in a Sqoop job. Sqoop is a part of the Hadoop... Read more »
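As a concrete illustration (the connection string, database, and table name below are hypothetical), a plain Sqoop import runs as a map-only MapReduce job, using 4 map tasks and no reduce tasks by default:

```shell
# Hypothetical import; Sqoop launches 4 map tasks by default, and a plain
# import has no reduce phase.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --table orders \
  --target-dir /data/orders

# Equivalent to passing the default explicitly:
#   -m 4        (short form)
#   --num-mappers 4
```

Changing `-m` controls how many parallel map tasks split the source table, which is the main knob for import parallelism.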