Products at DataWeave: We, the Products team at DataWeave, build data products that provide timely insights that are readily consumable and actionable, at scale.
Our underpinnings are scale, impact, engagement, and visibility.
We help businesses make data-driven decisions every day.
We also give them insights for long-term strategy.
We are focused on creating value for our customers and helping them succeed.

How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems there are.
We are in the business of making sense of messy public data on the web.
At serious scale! Read more on Become a DataWeaver.

What do we offer?
- Opportunity to work on some of the most compelling data products that we are building for online retailers and brands.
- Fun work environment.
- A flat hierarchy.
- Organization-wide visibility.
- Flexible working hours.
- Learning opportunities with courses, trainings, and tech conferences.
- Mentorship from seniors in the team.
- Last but not least, competitive salary packages and fast-paced growth opportunities.
Roles and Responsibilities:
- Build a low-latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality.
- Build robust RESTful APIs that serve data and insights to DataWeave and other products.
- Design user interaction workflows on our products and integrate them with data APIs.
- Help stabilize and scale our existing systems. Help design the next-generation systems.
- Scale our back-end data and analytics pipeline to handle increasingly large amounts of data.
- Work closely with the Head of Products and UX designers to understand the product vision and design philosophy.
- Lead / be a part of all major tech decisions. Bring in best practices.
- Mentor younger team members and interns.
- Constantly think scale, think automation. Optimize proactively.
- Be a tech thought leader. Add passion and vibrancy to the team.
- Push the envelope.

Skills and Requirements:
- 5-7 years of experience building and scaling APIs and web applications.
- Experience building and managing large-scale data / analytics systems.
- Have a strong grasp of CS fundamentals and excellent problem-solving abilities.
- Have a good understanding of software design principles and architectural best practices.
- Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
- Be able to argue convincingly why feature X of language Y rocks / sucks, or why a certain design decision is right / wrong, and so on.
- Be a self-starter: someone who thrives in fast-paced environments with minimal management.
- Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, and Elasticsearch.
- Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
- Use the command line like a pro.
- Be proficient in Git and other essential software development tools.
- Working knowledge of large-scale computation models such as MapReduce and Spark is a bonus.
- Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, or Datadog.
- Working knowledge of building websites and apps.
- Good understanding of integration complexities and dependencies.
- Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
- It's a huge bonus if you have personal projects (including open-source contributions) that you work on in your spare time. Show off some of the projects you have hosted on GitHub.
Skills: Python, NoSQL databases, RESTful APIs, MySQL, Apache Kafka, Big Data, and Scalability