Hello! I’m Ravi Kumar, a highly skilled Big Data Developer with extensive experience in designing, developing, and implementing large-scale data solutions. I am seeking a challenging position in a dynamic organization where I can use my expertise in Hadoop, Spark, and other Big Data technologies to drive innovative solutions and achieve business objectives.
I have built a Sqoop Importer Tool for transferring data between Hadoop (HDFS) and relational databases. The tool is GUI-based and written in Python. I have already released its first version and am continuing to enhance its features.
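At its core, such a tool wraps the `sqoop import` command line. A minimal sketch of how a Python wrapper might assemble that command; the function name, defaults, and connection details are illustrative assumptions, not the tool's actual code:

```python
def build_sqoop_import(jdbc_url, table, username, target_dir, num_mappers=4):
    """Assemble a `sqoop import` invocation for one RDBMS table (illustrative)."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--username", username,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
    ]

if __name__ == "__main__":
    cmd = build_sqoop_import(
        "jdbc:mysql://dbhost:3306/sales", "orders", "etl_user", "/data/raw/orders")
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # would require Sqoop on the PATH
```

A GUI front end would collect these parameters from form fields and hand them to a builder like this before launching the subprocess.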
My role in this project is Data Ingestion: I ingest CSV, TXT, and compressed data files from Amazon S3 into HDFS and create scripts based on the ingested files.
• Responsible for weekly reconciliation of cluster data against the AWS S3 environment.
• Automated the ingestion process through Python scripting.
• Responsible for debugging and correcting user queries.
• Performed DQM (data quality) checks on the ingested data against the summary sheets provided by vendors.
• Moved data from the ingestion cluster to the production cluster.
• Performed reconciliation of the data ingested into the ingestion cluster against S3.
• Performed extensive cleaning and checking of the raw data provided by vendors.
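The ingestion automation above can be sketched as a small Python driver that routes each vendor file to an HDFS landing directory and builds the copy command. The directory layout, bucket name, and use of `hadoop distcp` are assumptions for illustration:

```python
def route_by_extension(filename):
    """Pick an HDFS landing directory based on the vendor file type (illustrative)."""
    if filename.endswith((".csv", ".txt")):
        return "/data/raw/text"
    if filename.endswith((".gz", ".zip", ".bz2")):
        return "/data/raw/compressed"
    return "/data/raw/other"

def build_copy_command(s3_path, hdfs_dir):
    """Build a `hadoop distcp` invocation for one S3 object."""
    return ["hadoop", "distcp", s3_path, hdfs_dir]

if __name__ == "__main__":
    for name in ["trades.csv", "events.gz", "notes.pdf"]:
        target = route_by_extension(name)
        cmd = build_copy_command(f"s3://vendor-bucket/{name}", target)
        print(" ".join(cmd))
```

A real pipeline would list the S3 prefix (e.g. with boto3), run the commands, and log the results for the weekly reconciliation step.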
Part of the COE Big Data practice team, where I am aligned to multiple projects for installation, configuration, and enabling security on a range of Big Data tools.
Involved in multiple RFP and POC activities.
Automated many project-based tasks using Python.
Cross-cluster search: Implemented the ELK (Elasticsearch, Logstash, Kibana) stack to filter and analyze log data stored on clusters in different data centers.
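Cross-cluster search in Elasticsearch comes down to a small amount of configuration: registering the remote cluster, then prefixing the index pattern with its alias. A sketch in Kibana Dev Tools syntax; the cluster alias, host, and index pattern are illustrative assumptions:

```
PUT _cluster/settings
{
  "persistent": {
    "cluster.remote.dc2_logs.seeds": ["dc2-es-node1:9300"]
  }
}

GET dc2_logs:logstash-*/_search
{
  "query": { "match": { "level": "ERROR" } }
}
```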
Data Profiling: The raw data in this activity contained unwanted values such as nulls, junk values, missing values, and duplicates. To improve data quality, I removed all of these using SparkSQL and made the data suitable for visualization.
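The project ran this cleanup as SparkSQL over HDFS data; the same SQL pattern (null filter, junk-marker filter, `SELECT DISTINCT` for deduplication) is sketched below with the stdlib `sqlite3` so it stays self-contained. The table, columns, and junk markers are made-up illustration data:

```python
import sqlite3

# Raw vendor rows: a duplicate, a null key, and a junk value (illustrative data).
rows = [
    ("alice", "2021-01-01", 100.0),
    ("alice", "2021-01-01", 100.0),   # exact duplicate
    (None,    "2021-01-02", 50.0),    # null key
    ("bob",   "N/A",        75.0),    # junk placeholder value
    ("carol", "2021-01-03", 80.0),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw (name TEXT, trade_date TEXT, amount REAL)")
conn.executemany("INSERT INTO raw VALUES (?, ?, ?)", rows)

# Same shape of query a SparkSQL cleaning job would run:
# drop nulls and junk markers, then deduplicate.
clean = conn.execute("""
    SELECT DISTINCT name, trade_date, amount
    FROM raw
    WHERE name IS NOT NULL
      AND trade_date NOT IN ('N/A', 'NULL', '')
""").fetchall()
print(clean)
```

In SparkSQL the query text is essentially the same; only the engine registering the table (`spark.sql(...)` over a DataFrame view) differs.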
Migrating an on-premise database to GCP BigQuery: I took backups of the required databases and converted all backup files to CSV format using a Python script, transferred them to Google Cloud Storage, and then loaded the data into the BigQuery data warehouse with bq load. I also migrated Oracle databases to BigQuery using Informatica Intelligent Cloud Services.
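The two scripted steps of that pipeline, serializing backup rows to CSV and assembling the `bq load` command, can be sketched as below. Dataset, table, bucket, and schema names are illustrative assumptions:

```python
import csv
import io

def backup_rows_to_csv(rows, header):
    """Serialize backup rows to CSV text (stand-in for the real export step)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

def build_bq_load(dataset, table, gcs_uri, schema):
    """Assemble the `bq load` command run after uploading the CSV to GCS."""
    return [
        "bq", "load",
        "--source_format=CSV", "--skip_leading_rows=1",
        f"{dataset}.{table}", gcs_uri, schema,
    ]

if __name__ == "__main__":
    print(backup_rows_to_csv([(1, "widget")], ["id", "name"]))
    print(" ".join(build_bq_load(
        "sales", "orders", "gs://backup-bucket/orders.csv", "id:INTEGER,name:STRING")))
```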
Spark Structured Streaming with Kafka: In this activity, I had a 3-node Kafka cluster, with one node dedicated to ZooKeeper, and installed standalone Spark and Hadoop for processing and storing the data. I pulled real-time data from a weather website through its API and created a Kafka producer to ingest the weather data into a Kafka topic. I then used Structured Streaming to process the data and stored the results in HDFS, with Kibana for visualization.
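The producer side of that pipeline might look like the sketch below: shape each API response into a JSON message, then publish it to the topic. The field names, broker address, topic name, and use of the kafka-python client are assumptions, not the project's actual code:

```python
import json
import time

def to_weather_record(city, payload):
    """Shape one weather-API response into the JSON message sent to Kafka (fields assumed)."""
    return json.dumps({
        "city": city,
        "temp_c": payload["temp_c"],
        "humidity": payload["humidity"],
        "ts": payload.get("ts", int(time.time())),
    })

if __name__ == "__main__":
    # Requires kafka-python and a reachable broker; names are illustrative.
    from kafka import KafkaProducer
    producer = KafkaProducer(bootstrap_servers="kafka1:9092")
    msg = to_weather_record("Delhi", {"temp_c": 31.5, "humidity": 62})
    producer.send("weather", msg.encode("utf-8"))
    producer.flush()
```

Downstream, a Structured Streaming job would read the same topic with `spark.readStream.format("kafka")`, parse the JSON, and write the result to HDFS.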
Spark Structured Streaming with Kafka: This activity was for a cab-service provider. A Python application generated cab-driver information in real time. I created a custom Kafka producer and used Structured Streaming to process the data and check driver availability, such as the total number of cab drivers and how many are on duty.
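The availability metrics themselves reduce to counts over the latest status event per driver. A plain-Python sketch of that per-batch computation, with assumed field names (in the real job this would be a Structured Streaming aggregation over the Kafka stream):

```python
def driver_availability(events):
    """Count total and on-duty drivers from each driver's latest status event."""
    latest = {}
    for e in events:                 # events arrive in time order; later entries win
        latest[e["driver_id"]] = e["status"]
    total = len(latest)
    on_duty = sum(1 for s in latest.values() if s == "on_duty")
    return {"total_drivers": total, "on_duty": on_duty}

if __name__ == "__main__":
    events = [
        {"driver_id": "d1", "status": "on_duty"},
        {"driver_id": "d2", "status": "off_duty"},
        {"driver_id": "d1", "status": "off_duty"},   # d1 ends the batch off duty
        {"driver_id": "d3", "status": "on_duty"},
    ]
    print(driver_availability(events))  # → {'total_drivers': 3, 'on_duty': 1}
```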
In this organization, I joined as a Senior Big Data Developer, performing code development, automating data pipelines, and analyzing data for presentation in a business intelligence tool.
• Google Certified Associate Cloud Engineer.
• Certification course in Object-Oriented Programming from NIIT.
• Certification course in Java from Shadow Infosystem.
• Attended an Amazon day conference.
• Attended training on GCP at Google.
First of all, I love music; romantic music is my favorite. I enjoy watching technical videos, movies, and web series, and playing games with my buddies. I spend quite a lot of time traveling and doing photography, which keeps me fresh for the working environment. When I am free, I also spend time cooking and enjoying myself with friends.
"I thank Mr. Ravi Kumar for the wonderful job he did in helping us develop our program. He was professional, excellent, and hard-working. Thanks to him, we were able to achieve our goal on time, and we look forward to continuing to work with him in the future."
"I would like to express our satisfaction with the cooperation on the development of our web application. Mr. Ravi did a very professional job. We are satisfied with the solution given to us and with the communication flow throughout the project. We would like to recommend him, and we look forward to working with him on future projects."
If you have any questions about our service, feel free to contact us anytime. Simply use the form to the left, or one of the methods below.