Join a Leading Innovator in Tech Solutions!
Are you an expert in AWS and data architecture looking to make an impact? This is your chance to work with one of the top technology companies, recognized for groundbreaking achievements in communications, IT infrastructure, and cybersecurity. With a history spanning more than 20 years, our partner is trusted by major market players in sectors such as banking, telecommunications, and public administration.
Why You’ll Love This Role:
- Flexibility & Location: Work in a remote-first environment with flexible hours, contributing to dynamic projects across Western Europe.
- Professional Growth: Access cutting-edge technologies with personalized development plans, mentoring, and an Internal Academy offering training, certifications, and language courses.
- Well-being & Health: Enjoy a Multisport card, wellness workshops, annual ski trips, and regular team-building events.
- Family & Personal Life Support: Celebrate life's milestones with family perks including birthday gifts, wedding bonuses, new baby bonuses, and family weekend programs.
- Competitive Salary: Earn €3,500 per month.
Your Key Responsibilities:
- Design and implement automated tools for efficient data collection and AWS cloud transfer.
- Collaborate on large-scale data processing and analytics projects for international clients.
- Support the transition of clients from batch to near-real-time data processing.
- Consult on risk management, customer experience, and process optimization across industries.
- Lead data preparation and transformation projects in sectors such as banking, telco, automotive, and finance.
What We Are Looking For:
Mandatory Qualifications:
- Proficient in Spark, PySpark, and Python.
- Deep experience with AWS services: EMR, S3, Athena, EC2, Glue, Redshift, and RDS.
- Strong knowledge of the Hadoop ecosystem including HDFS, HBase, Hive, Presto, and Kafka.
- Bash scripting and Linux administration proficiency.
- Familiarity with databases such as PostgreSQL, MySQL, and Oracle.
- Experience in data modeling and batch data processing.
- Competent in version control systems (Git/SVN) and tools like Jira.
Preferred Qualifications:
- Experience with streaming data services such as Amazon Kinesis.
- Basic understanding of Databricks.
- Familiarity with Infrastructure-as-Code tools like Terraform.
About Us:
Join a vibrant, professional team known for innovative solutions and an ever-evolving approach to technology. We value your contributions and are committed to fostering an environment of growth and development.
Ready to Join Us?
Does this opportunity feel like the right fit for you? Send us your CV via Skilleto today, and we'll get back to you soon!