Big Data Architect

Frankfurt, Germany

Striving for excellence is in our DNA. Since 1993, we have been helping the world’s leading companies imagine, design, engineer, and deliver software and digital experiences that change the world. We are more than specialists; we are experts.

DESCRIPTION

We are currently looking for a Big Data Architect to join our Frankfurt office and make the team even stronger.

This is a high-profile, visible role, both within EPAM and on site with our clients, in which you will have a high degree of flexibility to own and enhance the technical landscape.

Responsibilities

  • Design data analytics solutions by utilizing the big data technology stack;
  • Create and present solution architecture documents with deep technical details;
  • Work closely with the business to identify solution requirements and key case studies/scenarios for future solutions;
  • Conduct solution architecture reviews/audits, and calculate and present ROI;
  • Lead implementation of the solutions from establishing project requirements and goals to solution "go-live";
  • Participate in the full cycle of pre-sales activities: direct communication with customers, RFP processing, development of proposals for the implementation and design of the solution, presentation of the proposed solution architecture to the customer, and participation in technical meetings with customer representatives;
  • Create and follow a personal education plan in the technology stack and solution architecture;
  • Maintain a strong understanding of industry trends and best practices;
  • Get involved in engaging new clients to further drive EPAM’s business in the big data space.

Requirements

  • Strong hands-on experience as a Big Data Architect, with a solid design/development background in Java, Scala, or Python;
  • Experience delivering data analytics projects and architecture guidelines;
  • Experience with big data solutions both on premises and in the cloud (Amazon Web Services, Microsoft Azure, Google Cloud);
  • Production project experience with at least one of the following big data technologies:
      • Batch processing: Hadoop and MapReduce/Spark/Hive;
      • NoSQL databases: Cassandra/HBase/Accumulo/Kudu;
  • Knowledge of Agile development methodology, Scrum in particular;
  • Experience in direct customer communication and in pre-selling business-consulting engagements to clients in large enterprise environments;
  • Fluent English.

Nice to have

  • Practical experience in performance tuning and optimization, bottleneck problem analysis;
  • Experience in Linux based environments;
  • Understanding of data modelling challenges and techniques in an enterprise environment;
  • Stream processing: Kafka/Flink/Spark Streaming/Storm;
  • Background in traditional data warehouse and business intelligence stacks (ETL, MPP Databases, Tableau, Microsoft Power BI, SAP Business Objects).

We offer

  • Competitive compensation depending on experience and skills;
  • Regular performance assessments;
  • Opportunities for personal and professional growth;
  • Friendly and enjoyable team environment;
  • Relocation support;
  • Regular corporate and social events;
  • 30 days’ holiday per annum.