Our client is a leader in the single-family rental (SFR) investment market, offering a comprehensive platform designed to make real estate investing more accessible, cost-effective, and straightforward. They combine a deep passion for helping investors build wealth through real estate with cutting-edge technology that redefines the investment process. With a dynamic team of over 600 professionals, their collaborative and proactive culture drives their rapid growth. After closing a Series E funding round last year and raising $240 million, the company continues to expand its presence with offices in California, Texas, and New York, alongside numerous remote opportunities. Their growth strategy includes the recent acquisitions of Great Jones (a full-service property management company), Stessa (financial management software), Rent Prep (tenant screening and placement services), and Mynd (a property management platform for both retail and institutional investors).

About the team:

The Data Engineering team is the core of Data, which everything else relies upon. The team is responsible for the development and management of the Enterprise Data Platform, which powers the company and all respective Data functions. The Enterprise Data Platform is crucial for integrating, managing, and providing data across the business. There are multiple sub-disciplines within Data Engineering, each contributing to the overall effectiveness and efficiency of data operations. It is a highly cohesive team consisting of four pods: Data Infrastructure, ML & GenAI Ops, Analytics Engineering, and Data Services.
They architect and build the core data infrastructure supporting the entire company, build data ingestions from internal and external applications, support infrastructure for ML & GenAI products and applications, merge various data feeds into easy-to-use, valuable data sets that support analytics, and design and create scalable, packaged data solutions in the form of various data services.

About the role:

We are looking for a talented Data Engineer to join the Data Services pod in the established Data Engineering team. As a Data Engineer, you will be instrumental in developing and maintaining data solutions and data pipeline infrastructure supporting the broader data-heavy functions within the company, including Reporting, Analytics, and Data Science. The Data Engineer collaborates with Software Development and DevOps teams to maintain the scalability, reliability, and maintainability of the company's overall technical stack. The Data Engineer also works closely with business and product teams to ensure the timeliness and efficiency of data-driven decision making. The primary focus of this role is to design, develop, and implement clean, structured data for use by analysts and reporting. Responsibilities also include quality documentation and adherence to Data Engineering best practices.
What you will do:

● Operate in a remote-first environment, collaborating with individuals and teams distributed across multiple time zones worldwide
● Work with internal partner teams and, occasionally, external partners to define the scope and requirements for Data Engineering projects
● Design, develop, and implement Data Engineering solutions
● Maintain work performance, transparency, and process compliance through the use of the company's task tracking tools and communication channels
● Participate in projects of the broader Engineering and Data teams by acting as an SME and as a contributor to business and engineering initiatives
● Work on continuous improvement of the company's Data Engineering practices through platform/tools evaluation, documentation, knowledge sharing, etc.

Qualifications:

● 4 years of technical experience
● Flexibility in adjusting to technology stack selection; ability and desire to learn new technologies quickly
● Strong proficiency in data modeling
● SQL fluency, preferably with exposure to multiple dialects
● Scripting skills relevant to the development of data pipelines (strong preference for Python)
● Understanding of QA practices relevant to Data Engineering
● Familiarity with source control tools and CI/CD
● Understanding of performance tuning across multiple components of the technical stack

Nice-to-have:

● Knowledge of AWS and Azure cloud services
● Previous experience in a start-up or agile environment
● Additional experience with AWS, Snowflake, Fivetran, Airflow, dbt, and Sigma is desirable

Location:

This is a remote position; however, the ideal candidate should be available to work from 9 am to 12 pm Pacific Time.