I currently work at Enigma Technologies, Inc. as a data engineer on the data ingestion squad, which is responsible for building and maintaining all of the company's data ingestion workflows and for ensuring the quality and integrity of the data we deliver.
Capable of processing large sets of structured, semi-structured and unstructured data. Experienced with data architecture, including data ingestion pipeline design, ETL workflow optimization, data modeling, data mining and advanced data processing.
Able to assess business rules, collaborate with stakeholders and perform source-to-target data mapping, design and review.
Feel free to contact me at tahasadiki.pro@gmail.com
Data ingestion & infrastructure
I worked on a data integration and visualization project with the Talent Acquisition team, where I built an ETL process (executed once a day) that consolidates the candidate information gathered from different sources throughout the hiring process into a single relational database. We then connected this database to Metabase to inspect and analyze the data, improve the hiring process, and track the performance of the talent acquisition team.
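As a rough illustration of that daily consolidation step, here is a minimal Python sketch. The source names, file paths, join key and table layout are hypothetical stand-ins; the real pipeline pulled from the team's actual tools and loaded into the production database.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical sources: in the real pipeline these were the Talent Acquisition
# team's tools; here they are stand-in CSV exports.
SOURCES = {
    "applications": "exports/applications.csv",
    "interviews": "exports/interviews.csv",
    "offers": "exports/offers.csv",
}

def run_daily_etl(db_url: str = "postgresql://user:pass@localhost/talent") -> None:
    engine = create_engine(db_url)

    # Extract: read each source into a DataFrame.
    frames = {name: pd.read_csv(path) for name, path in SOURCES.items()}

    # Transform: normalise column names and merge everything on candidate_id.
    for df in frames.values():
        df.columns = [c.strip().lower() for c in df.columns]
    candidates = frames["applications"]
    for name in ("interviews", "offers"):
        candidates = candidates.merge(
            frames[name], on="candidate_id", how="left", suffixes=("", f"_{name}")
        )

    # Load: refresh the reporting table that Metabase reads from.
    candidates.to_sql("candidates_pipeline", engine, if_exists="replace", index=False)

if __name__ == "__main__":
    run_daily_etl()
```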
I developed a user-friendly decision support service that forecasts time series and automatically selects the most suitable model (ARIMA, ETS, THETA, structural models...); a short sketch of the selection idea follows the list below.
* Microservices architecture
* Central service: Java EE - Spring Boot
* Forecasting service: R - Plumber
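The production forecasting service was written in R (Plumber), so the Python sketch below only illustrates the automatic-selection idea: fit a few candidate models, score them on a holdout window, and keep the best one. The model families, orders and horizon are assumptions, not the service's actual configuration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def select_and_forecast(series: pd.Series, horizon: int = 12) -> pd.Series:
    """Fit candidate models, keep the one with the lowest holdout MAE, then refit."""
    train, valid = series[:-horizon], series[-horizon:]

    # Candidate models (illustrative subset; the real service covered more families).
    candidates = {
        "arima": lambda s: ARIMA(s, order=(1, 1, 1)).fit(),
        "ets": lambda s: ExponentialSmoothing(s, trend="add").fit(),
    }

    best_name, best_mae = None, np.inf
    for name, fit_fn in candidates.items():
        fitted = fit_fn(train)
        mae = np.mean(np.abs(fitted.forecast(horizon) - valid.values))
        if mae < best_mae:
            best_name, best_mae = name, mae

    # Refit the winning model on the full series and forecast ahead.
    final = candidates[best_name](series)
    return final.forecast(horizon)
```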
Real-time road traffic modeling for the Bab-Zair intersection in Rabat, aimed at minimizing waiting time (using a PSO algorithm combined with tabu search) and reducing the number of conflicts and accidents; a minimal PSO sketch follows the list below.
* Simulation software: PTV VISSIM
* Analysis software: SSAM (Surrogate Safety Assessment Model)
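In the project, the objective was the VISSIM-simulated waiting time and the PSO was combined with tabu search; the sketch below is a plain PSO loop over a placeholder objective, just to show the optimization pattern. The swarm size, bounds, coefficients and the `waiting_time` function are assumptions.

```python
import numpy as np

def waiting_time(timings: np.ndarray) -> float:
    # Placeholder objective: in the real project this value came from a
    # PTV VISSIM simulation of the Bab-Zair intersection.
    return float(np.sum((timings - 30.0) ** 2))

def pso(objective, dim=4, n_particles=20, iters=100, bounds=(5.0, 120.0)):
    """Minimal particle swarm optimization over signal timings (in seconds)."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Standard velocity update: inertia + cognitive + social terms.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

if __name__ == "__main__":
    best_timings, best_cost = pso(waiting_time)
    print(best_timings, best_cost)
```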
I worked on a web application for quality management and monitoring compliance with internal standards and regulations.
* Back-end: Java EE - Spring Boot
* Front-end: HTML5, CSS3, Bootstrap
* DBMS: SQL Server
EMI
Preparatory classes for the grandes écoles (classes préparatoires aux grandes écoles)
With highest honors (mention très bien)
I created a Python command-line tool to scrape telecontact.ma, then Dockerized it and set up CircleCI, my CI/CD tool of choice, to build and push the image to Docker Hub regularly. I used a multi-node Apache Airflow setup on AWS to create a scraping workflow that spins up multiple instances of that Docker image (the telecontact scraper) with different queries to scrape the desired data from the telecontact.ma website.
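A minimal sketch of what such a workflow can look like, assuming the Airflow Docker provider's `DockerOperator` and a Celery worker queue; the image name, CLI flags and query list are hypothetical stand-ins for the real scraper.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

# Hypothetical search queries; the real workflow used its own query list.
QUERIES = ["restaurants casablanca", "pharmacies rabat", "hotels marrakech"]

with DAG(
    dag_id="telecontact_scraping",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    for i, query in enumerate(QUERIES):
        # One containerized scraper run per query.
        DockerOperator(
            task_id=f"scrape_{i}",
            image="example/telecontact-scraper:latest",  # hypothetical image name
            command=["scrape", "--query", query],        # hypothetical CLI flags
            docker_url="unix://var/run/docker.sock",
            queue="scrapers",  # route to Celery workers on the scraping nodes
        )
```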
While working on an ETL project using Singer.io, I created this Recruitee tap to extract and transform data from the Recruitee API and load it into a data warehouse, preparing it for visualization and analytics.
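For context, a Singer tap is a program that writes SCHEMA and RECORD messages to stdout for a Singer target to consume. The sketch below shows that shape with the `singer` library; the Recruitee endpoint, auth header and field list are assumptions for illustration, not the tap's actual code.

```python
import requests
import singer

# Hypothetical schema: the real tap exposes more streams and fields.
CANDIDATE_SCHEMA = {
    "properties": {
        "id": {"type": "integer"},
        "name": {"type": ["string", "null"]},
        "created_at": {"type": ["string", "null"], "format": "date-time"},
    }
}

def fetch_candidates(company_id: str, api_token: str) -> list[dict]:
    # Endpoint and auth style are assumptions, to be checked against the Recruitee API docs.
    resp = requests.get(
        f"https://api.recruitee.com/c/{company_id}/candidates",
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("candidates", [])

def main() -> None:
    # Emit the stream schema, then one RECORD message per candidate.
    singer.write_schema("candidates", CANDIDATE_SCHEMA, key_properties=["id"])
    for candidate in fetch_candidates("my-company", "MY_API_TOKEN"):
        record = {k: candidate.get(k) for k in CANDIDATE_SCHEMA["properties"]}
        singer.write_record("candidates", record)

if __name__ == "__main__":
    main()
```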
The project is an app that lists all the available information about nearby shops (location, distance, phone numbers, ...) based on the user's location; a small request sketch follows the list below.
* Main technologies: Java, Spring Boot, Thymeleaf
* DB & API: PostgreSQL, TomTom API
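The app itself is built in Java/Spring Boot; purely as an illustration of the kind of call involved, here is a Python sketch of a nearby-places lookup. The TomTom endpoint path, parameters and response fields are written from memory and should be verified against TomTom's Search API documentation; the API key is a placeholder.

```python
import requests

def nearby_shops(lat: float, lon: float, radius_m: int = 1000, api_key: str = "YOUR_KEY"):
    """Query a nearby-search endpoint around the user's location (sketch)."""
    # Endpoint and parameter names recalled from the TomTom Search API; verify before use.
    resp = requests.get(
        "https://api.tomtom.com/search/2/nearbySearch/.json",
        params={"lat": lat, "lon": lon, "radius": radius_m, "key": api_key},
        timeout=30,
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    # Keep only the fields the app would display.
    return [
        {
            "name": r.get("poi", {}).get("name"),
            "phone": r.get("poi", {}).get("phone"),
            "distance_m": r.get("dist"),
            "position": r.get("position"),
        }
        for r in results
    ]

if __name__ == "__main__":
    # Example: shops within 1 km of a point in Rabat.
    for shop in nearby_shops(34.0209, -6.8416):
        print(shop)
```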
A simple React app (just for fun ^^) that communicates with two APIs, RoboHash and JSONPlaceholder, to display robot cards and do a real-time search by robot name.