I have been a software engineer for close to six years, with 5+ years of Java and two years of Python programming experience. I recently relocated to Germany and am looking for full-time employment in Berlin.

Of those six years, I spent two as a data engineer, where I designed, built, and maintained Hadoop clusters, Kafka clusters, Spark on YARN, Airflow clusters, Spark applications (in Java), Airflow tasks (in Python), Redis, Grafana, Flume, Hive, MySQL, MSSQL, and BigQuery. This role also gave me hands-on ETL experience: data was ingested into Hadoop clusters, where Spark handled the extract, transform, and load steps, and the processed data was stored in Elasticsearch, MSSQL, and a second Hadoop cluster. I implemented two types of Spark applications, one for batch processing and one for streaming.

Before becoming a data engineer, I worked for two years as a DevOps engineer, designing, building, and operating Elasticsearch, Airflow, KairosDB, Chef, Redis, Kafka, and Mesosphere. I also created an automated infrastructure configuration management system using Chef and Mesosphere. Before that, I spent two years as a back-end web engineer, designing numerous schemas for relational databases. I offer two years of experience with the NoSQL stores Redis and KairosDB. I have also used Cassandra, but not consistently enough during my career to call myself an expert.