About CBRE:
CBRE Group, Inc. is the world's largest commercial real estate services and investment firm, with revenue of $26.106 billion (as of September 30, 2021) and more than 100,000 employees (excluding affiliate offices).
CBRE has been on the Fortune 500 list every year since 2008, ranking #122 in 2021.
It has also been voted the industry's top brand by the Lipsey Company for 21 consecutive years, and has been named one of Fortune's Most Admired Companies for 12 years in a row, including being ranked number one in the real estate sector in 2020 for the second consecutive year.
CBRE's Digital & Technology (D&T) organization is dedicated to revolutionizing the real estate space with its software products.
Our breakthrough products have put real estate management at our clients' fingertips.
Our small, fast-paced teams are responsible for creating innovative software that enhances the experiences of both our internal and external clients.
As part of CBRE's Digital & Technology organization, you'll learn from brilliant software engineers and designers while tackling tough problems whose solutions will drive our technology forward.
Key Responsibilities:
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional/non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
Keep our data separated and secure across national boundaries through multiple data centers and regions.
Work with data and analytics experts to strive for greater functionality in our data systems.
Test databases and perform bug fixes.
Develop best practices for database design and development activities.
Quickly analyze existing SQL code and improve it to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code (see the sketch after this list).
Take on technical leadership of database projects across various scrum teams.
Manage exploratory data analysis to support database and dashboard development.
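To make the SQL analysis and tuning expectation concrete, here is a minimal sketch of one common improvement: replacing a correlated subquery with a window function and adding a supporting index. The table and column names (orders, customer_id, order_date, amount) are hypothetical, and PostgreSQL syntax is assumed.

    -- Before: the correlated subquery re-executes for every row of orders.
    SELECT o.customer_id, o.order_date, o.amount
    FROM orders o
    WHERE o.order_date = (SELECT MAX(o2.order_date)
                          FROM orders o2
                          WHERE o2.customer_id = o.customer_id);

    -- After: a single pass with a window function ranks each customer's orders.
    SELECT customer_id, order_date, amount
    FROM (SELECT customer_id, order_date, amount,
                 ROW_NUMBER() OVER (PARTITION BY customer_id
                                    ORDER BY order_date DESC) AS rn
          FROM orders) ranked
    WHERE rn = 1;

    -- A composite index supports both the grouping and the ordering.
    CREATE INDEX idx_orders_customer_date ON orders (customer_id, order_date DESC);

In practice the right rewrite depends on what EXPLAIN ANALYZE reports; this is one common pattern, not a universal fix.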
Required Skills:
Expert knowledge of databases such as PostgreSQL (preferably cloud-hosted) and of a cloud-based data warehouse (such as Snowflake or Azure Synapse), with strong SQL programming experience.
Competence in data preparation and/or ETL tools such as SnapLogic, Matillion, Azure Data Factory, AWS Glue, and SSIS (preferably strong working experience in one or more) to build and maintain data pipelines and flows.
Programming experience in Go (Golang), Python, and shell scripts (bash/zsh, grep/sed/awk, etc.).
Deep knowledge of databases and stored procedures, and of optimizing queries over very large data sets.
In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning (a de-duplication and partitioning sketch follows this list).
Experience building the infrastructure required for data ingestion and analytics.
Ability to fine-tune report-generating queries.
Solid understanding of normalization and denormalization of data, database exception handling, transactions, query profiling, performance counters, debugging, and database and query optimization techniques.
Understanding of index design and performance-tuning techniques.
Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions (a column-level security sketch also follows this list).
Experience understanding source data from various platforms and mapping it into entity-relationship (ER) models for data integration and reporting.
Adherence to standards for all databases, data models, data architecture, and naming conventions.
Exposure to source control tools such as Git and Azure DevOps.
Understanding of Agile methodologies (Scrum, Kanban).
Preferably, experience with NoSQL databases, including migrating data into other types of databases with real-time replication.
Understanding of data modeling techniques and working knowledge of OLTP and OLAP systems.
Experience with automated testing and coverage tools.
Experience with CI/CD automation tools (desirable).
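As an illustration of the de-duplication and partitioning items above, here is a minimal PostgreSQL sketch; the staging table (stg_events), business key (event_id), and load timestamp (loaded_at) are hypothetical.

    -- De-duplicate a staging load, keeping the most recently loaded row per key.
    DELETE FROM stg_events
    WHERE ctid IN (
        SELECT ctid
        FROM (SELECT ctid,
                     ROW_NUMBER() OVER (PARTITION BY event_id
                                        ORDER BY loaded_at DESC) AS rn
              FROM stg_events) d
        WHERE d.rn > 1
    );

    -- Range-partition the target by month so old partitions can be detached cheaply.
    CREATE TABLE events (
        event_id   bigint      NOT NULL,
        event_time timestamptz NOT NULL,
        payload    jsonb
    ) PARTITION BY RANGE (event_time);

    CREATE TABLE events_2021_12 PARTITION OF events
        FOR VALUES FROM ('2021-12-01') TO ('2022-01-01');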
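For the column-level security item, a brief sketch using PostgreSQL's pgcrypto extension and column-level grants; the table, columns, role, and key literal are placeholders (a real key would come from a secrets manager, and TDE and signed stored procedures are engine-specific features, e.g. in SQL Server).

    CREATE EXTENSION IF NOT EXISTS pgcrypto;

    -- Encrypt a sensitive column; ssn_plain is assumed to be text,
    -- and ssn_encrypted a bytea column on the same hypothetical table.
    UPDATE customers
    SET ssn_encrypted = pgp_sym_encrypt(ssn_plain, 'replace-with-managed-key');

    -- Grant reporting users only the non-sensitive columns.
    REVOKE ALL ON customers FROM PUBLIC;
    GRANT SELECT (customer_id, customer_name) ON customers TO reporting_role;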
Education and Experience:
Bachelor's degree (BA/BS) in a related field such as information systems, mathematics, or computer science, or equivalent work experience.
Requires technical knowledge in multiple disciplines/processes. Typically has 2-5 years of relevant work experience.