<p><strong>About Smart Working</strong><br>At Smart Working, we believe your job should not only look right on paper but also feel right every day. This isn't just another remote opportunity - it's about finding where you truly belong, no matter where you are. From day one, you're welcomed into a genuine community that values your growth and well-being.</p> <p>Our mission is simple: to break down geographic barriers and connect skilled professionals with outstanding global teams and products for full-time, long-term roles. We help you discover meaningful work with teams that invest in your success, where you're empowered to grow personally and professionally.</p> <p>Join one of the highest-rated workplaces on Glassdoor and experience what it means to thrive in a truly remote-first world.</p> <p><strong>About the Role</strong><br>This is a long-term, strategic role, not a short sprint. You'll be embedded in a collaborative engineering and analytics team, working across the full data lifecycle: ingestion, transformation, modelling, and surfacing insights through Looker. You'll work closely with stakeholders across commercial, product, and marketing to ensure data is reliable, scalable, and meaningful.</p> <p>You'll be given real ownership.
This is a role for someone who wants to shape standards, improve the architecture, and grow with a brand that takes its data seriously.</p> <p><br></p><b>Responsibilities</b><ul> <li>Design, build, and maintain robust ETL/ELT pipelines that move data from source systems into Google BigQuery, ensuring reliability, scalability, and observability at every stage.</li> <li>Develop and enforce data models and schema standards using best-practice SQL and dimensional modelling principles, with a focus on clarity, reuse, and performance.</li> <li>Own the Google BigQuery environment, optimising queries, managing costs, enforcing data governance, and ensuring the platform scales alongside the business.</li> <li>Build and maintain Looker explores, LookML models, and dashboards that translate complex datasets into clear, actionable business intelligence for non-technical stakeholders.</li> <li>Work across the full Google Cloud Platform stack, including Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and Composer, to architect end-to-end data solutions.</li> <li>Partner with analytics, engineering, and commercial teams to understand data requirements and translate business problems into scalable technical solutions.</li> <li>Champion data quality and testing frameworks, implementing monitoring and alerting so that issues are caught early and resolved quickly.</li> <li>Contribute to documentation, coding standards, and architectural decision records so the team can move fast with confidence.</li> <li>Mentor junior data team members and set the bar for engineering rigour across the data function.</li> <li>Stay current with developments in the modern data stack and proactively recommend tooling or process improvements where appropriate.</li> </ul><p><br></p><b>Requirements</b><ul> <li>5+ years of experience in SQL and data modelling, with strong command of dimensional modelling, star schemas, and performance optimisation.</li> <li>3+ years working with Google
BigQuery in a production environment.</li> <li>3+ years hands-on experience with Google Cloud Platform (Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, Composer).</li> <li>3+ years building and maintaining ETL/ELT pipelines at scale.</li> <li>1+ year working with Looker and LookML to deliver business-facing dashboards and data products.</li> <li>Demonstrable experience leading at least one data project end-to-end, from scoping through to delivery.</li> <li>Able to communicate clearly with non-technical stakeholders about data limitations, timelines, and trade-offs.</li> <li>Comfortable making pragmatic architecture decisions in a cloud-native, modern data stack environment.</li> </ul><p><br></p><b>Nice to Have</b><ul> <li>Experience with dbt (Data Build Tool) for transformation layer management and testing.</li> <li>Familiarity with orchestration tools such as Apache Airflow or Cloud Composer.</li> <li>Python skills for pipeline scripting, data validation, or automation.</li> <li>Background in retail, ecommerce, or fashion, understanding how data flows across commercial and digital channels.</li> <li>Exposure to real-time or streaming data pipelines using Pub/Sub or Dataflow.</li> <li>Experience with Terraform or Infrastructure-as-Code practices in a GCP context.</li> <li>Familiarity with data governance frameworks, cataloguing, and lineage tracking.</li> </ul><p><br></p><b>Benefits</b><ul> <li>Fixed Shifts: 12:00 PM - 9:30 PM IST (Summer) | 1:00 PM - 10:30 PM IST (Winter)</li> <li>No Weekend Work: Real work-life balance, not just words</li> <li>Day 1 Benefits: Laptop and full medical insurance provided</li> <li>Support That Matters: Mentorship, community, and forums where ideas are shared</li> <li>True Belonging: A long-term career where your contributions are valued</li> </ul><p><br></p><p>At Smart Working, you'll never be just another remote hire.</p> <p>Be a Smart Worker - valued, empowered, and part of a culture that celebrates integrity,
excellence, and ambition.</p> <p>If that sounds like your kind of place, we'd love to hear your story.</p> <br/><br/>Please mention the word **ENGAGING** and tag ROC4yMjguMTExLjcx when applying to show you read the job post completely (#ROC4yMjguMTExLjcx). This is a beta feature to avoid spam applicants. Companies can search these words to find applicants that read this and see they're human.