About
At PropertyLens, we’re a team of data specialists and technologists who firmly believe that the information gap between buyers and sellers of real estate can, and should, be closed. We know: we’ve bought homes before and been terrified by the prospect of an “unexpected surprise” expense. Our mission is to empower every homebuyer with comprehensive knowledge about their prospective property, ensuring confidence in their choice and the safety of their investment.
As a data engineer at PropertyLens, you'll work on a small interdisciplinary team of data engineers, scientists, and analysts to build and maintain the datasets and models that drive 100% of our revenue. We're a data-first company with strong executive buy-in and a culture of innovation. Our number-one priority is finding, and making accessible, any data that can help a homebuyer make a more confident purchasing decision.
Responsibilities
- Integrate third-party data providers and create new datasets to enrich user property reports
- Create and maintain data extraction and transformation services in existing data pipelines
- Develop new APIs to support future product development
- Research and develop new mechanisms (machine learning, supplemental data, geospatial models) for deriving valuable insights from large property datasets
- Support and maintain existing databases, document stores, and related infrastructure
- Integrate existing e-commerce and marketing analytics platforms into reporting dashboards supporting executive and product decision-making
Pay and Benefits
💵 $110,000 to $140,000 depending on experience
💰 Competitive stock options in a seed-stage startup
🏥 Comprehensive medical, dental, and vision coverage (100% company-paid for employees)
🏖️ Unlimited PTO policy
👩‍💻 Remote work, with a hybrid option for Cincinnati Metro employees
Requirements
- 2-4+ years as a software engineer writing Python, with experience shipping services (FastAPI, Flask, Litestar) and working with geospatial/data science libraries (GeoPandas, GDAL, Polars)
- 2+ years deploying ETL-like pipelines on AWS with Lambda, Step Functions, Batch, Glue, Airflow, etc.
- 2+ years working with Postgres; experience with spatial data (PostGIS) a plus
- Academic background in data science, GIS, or software engineering
- U.S. candidates only, with a strong preference for those in the Cincinnati Metro Area
- Bonus, but not required: familiarity with ML model selection for image classification, NLP, and recommender systems, plus relevant Python libraries