Big Data Developer | Global Data & Content

US-TX-Corp - Westlake
Regular Full-Time


Solera is the world’s leading provider of software and services to the automobile insurance claims processing industry. Created in 2005 with a vision to transform the way the industry operates, Solera is now active in over 80 countries across six continents.


Our Global Data & Content team develops global solutions and is part of an ambitious new program to redesign the way we extract insights from data. The team is responsible not only for the insights but for the end-to-end process, starting with data acquisition. You will be exposed to new technologies and will have the opportunity to develop yourself professionally. We are organized as a tribe with multiple squads and no hierarchies, just a group of volunteers willing to make our future happen.


In the beginning, software was 20% of the process and humans were 80%. Fast forward to where we are today and across the next five years, and software moves to become 80%.


Our team is looking for a driven Big Data Developer to be based at Solera's offices in Westlake (Dallas / Fort Worth area).

Position Responsibilities:

  • Write new, reusable code to support our in-house applications in an agile, test-driven development environment
  • Develop scalable batch and streaming applications using Hadoop, owning their end-to-end lifecycle (design, development, testing and deployment)
  • Follow the agreed quality standards and test-coverage targets
  • Support ETL, data analysis and modelling tasks, integrating massive amounts of data from different sources
  • Develop, execute and maintain manual and automated test scripts
  • Support continuous improvement by investigating alternatives and new technologies
  • Provide technical guidance and coaching to less experienced developers
  • Champion agile methodologies and demonstrate their potential to achieve ambitious objectives quickly and in a structured manner


  • Comprehensive knowledge of hardware, software, application, and systems engineering
  • Strong organizational, multi-tasking and time-management skills; able to work under time pressure
  • Strong abstraction and rational-thinking skills, with the ability to understand complex problems
  • Ability to explain complex technical issues in a way that non-technical people can understand
  • Business-level fluency in English and Spanish (written and spoken)


Must Have:

  • Bachelor’s degree in Computer Science or a numerate discipline
  • At least 5 years of experience in software development
  • At least 2 years of experience with the Hadoop ecosystem
  • Experience with Big Data technologies such as the Hadoop stack, Kafka and Spark
  • Deep knowledge of Java and Scala
  • Extensive knowledge of Design Patterns and SOLID principles
  • Strong data modelling skills
  • Experience with Relational databases and knowledge of NoSQL
  • Continuous Integration/Delivery/Deployment tools and methodologies: Maven, Git, Nexus, Bamboo/Jenkins, Sonar, etc.
  • Strong grasp of agile development concepts and experience working in an Agile/Scrum environment

Highly Desirable:

  • Experience with Stream Processing
  • Experience with Hortonworks distribution


Benefits:

  • Highly competitive pay and health & wellness plans
  • 401K
  • Tuition Reimbursement
  • No B.S. policy that promotes transparency and accountability
  • Beautiful and uncommon workspaces to collaborate and unwind
  • Free gym membership (to the awesome gym that’s right next to our office)
  • Free meals, healthy snacks (like nuts and yogurt parfaits), some indulgent snacks (like baked chips and dark chocolates), and refrigerators full of juices, teas and other life-essential beverages (including Red Bull)
  • The latest and greatest in all things technology
  • Lots and lots of awesome cars

