· SQL (Stored Procedures and Views)
· Data Warehousing
· Data Pipelining
· Python
· ETL
Job Description
The standard Data Engineer, regardless of level or tenure, uses SQL, Snowflake, and dbt the most. Snowflake and dbt experience is preferred, but a local engineer with a similar tool such as Databricks, Redshift, or Google BigQuery and a strong SQL baseline would still be considered. The role handles the design and construction of scalable data management systems, ensures that all data systems meet company requirements, and researches new uses for data acquisition. The engineer is required to know and understand the ins and outs of the industry, such as data mining practices, algorithms, and how data can be used. All efforts will be focused on three core pillars: integrating new data from Comerica, speeding up existing systems for the category jump/increase, and running mock cycles to see if systems are ready to integrate Comerica data.
Information
Locations:
Position Open to: Only locals
Industry: Information Technology
Status: Open
Job Age: 14 Days
Created Date: 01/14/2026
No. of Positions: 1
Duration: 6-12 months
Zip Code: