At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto.
Founded in 2017, Anchorage Digital is a regulated crypto platform that provides institutions with integrated financial services and infrastructure solutions. With the first federally chartered crypto bank in the US, Anchorage Digital offers institutions an unparalleled combination of secure custody, regulatory compliance, product breadth, and client service. We’re looking to diversify our team with people who are humble, creative, and eager to learn.
We are a remote-friendly, global team, but provide the option of working in-office in New York City, Sioux Falls, Porto, Lisbon, and Singapore. For our colleagues not located near our beautiful offices, we encourage and sponsor quarterly in-person collaboration days to work together and further deepen our Village.
Join the Asset Data team and build the streaming data infrastructure that powers Anchorage's digital asset platform. You'll design systems that ingest real-time blockchain and market data from diverse providers, transforming raw feeds into certified, trusted data products. We're creating contract-governed supply chains that let us onboard new assets and providers quickly while maintaining the low-latency, high-availability SLOs our business depends on.
We have created the Factors of Growth & Impact to help Villagers measure their impact, articulate coaching and feedback, and capture the rich, rewarding learning that happens while exploring, developing, and mastering the capabilities and contributions within and outside of the Asset Data role.
Technical Skills:
- Build streaming data pipelines for blockchain data (onchain transactions, staking rewards, validator info) and market data (prices, trades, order books)
- Design and implement data contracts and validation gates that enforce quality and schema compliance at ingestion points
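To make the "validation gate" idea above concrete, here is a minimal sketch in Go of a contract check applied at an ingestion point. Everything in it (the StakingReward shape, its field rules, the quarantine behavior) is illustrative, not Anchorage Digital's actual schema or code:

```go
// Minimal sketch of a validation gate enforcing a data contract at ingestion.
// All type and field names are hypothetical, chosen only for illustration.
package main

import (
	"bytes"
	"encoding/json"
	"errors"
	"fmt"
	"time"
)

// StakingReward is a hypothetical contract for one ingested record.
type StakingReward struct {
	Chain     string    `json:"chain"`
	Validator string    `json:"validator"`
	Amount    string    `json:"amount"` // decimal string avoids float precision loss
	Timestamp time.Time `json:"timestamp"`
}

// validate is the gate: records that fail are rejected before they can
// reach downstream consumers.
func validate(raw []byte) (*StakingReward, error) {
	dec := json.NewDecoder(bytes.NewReader(raw))
	dec.DisallowUnknownFields() // schema compliance: reject fields the contract doesn't know
	var r StakingReward
	if err := dec.Decode(&r); err != nil {
		return nil, fmt.Errorf("schema violation: %w", err)
	}
	switch {
	case r.Chain == "":
		return nil, errors.New("missing chain")
	case r.Validator == "":
		return nil, errors.New("missing validator")
	case r.Timestamp.After(time.Now()):
		return nil, errors.New("timestamp in the future")
	}
	return &r, nil
}

func main() {
	good := []byte(`{"chain":"ethereum","validator":"0xabc","amount":"1.25","timestamp":"2024-01-02T03:04:05Z"}`)
	if r, err := validate(good); err == nil {
		fmt.Printf("certified: %+v\n", *r)
	}
	bad := []byte(`{"chain":"","validator":"0xabc","amount":"1.25","timestamp":"2024-01-02T03:04:05Z"}`)
	if _, err := validate(bad); err != nil {
		fmt.Println("quarantined:", err)
	}
}
```

Decoding with DisallowUnknownFields turns silent producer drift into an explicit contract violation, which is the point of gating at ingestion rather than cleaning up downstream.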
Complexity and Impact of Work:
- Collaborate on designing the architecture for standardized ingestion patterns that enable rapid onboarding of new blockchains and market data feeds
- Establish redundancy and failover patterns to meet Tier 1 availability and freshness SLOs for critical data products
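As one sketch of what a redundancy and failover pattern for a freshness SLO can look like, the Go snippet below walks an ordered list of providers and serves the first result inside a staleness budget. The Provider interface, the 30-second budget, and the stub vendors are assumptions for illustration, not the team's actual design:

```go
// Minimal sketch of provider failover driven by a freshness SLO.
// Interface, budget, and vendor names are illustrative assumptions.
package main

import (
	"errors"
	"fmt"
	"time"
)

// Quote is a hypothetical market data point carrying its source timestamp.
type Quote struct {
	Symbol string
	Price  string // decimal string avoids float precision loss
	AsOf   time.Time
}

// Provider abstracts one upstream feed (primary vendor, backup vendor, ...).
type Provider interface {
	Name() string
	Latest(symbol string) (Quote, error)
}

// maxStaleness is an assumed freshness budget for a Tier 1 data product.
const maxStaleness = 30 * time.Second

// freshQuote walks providers in priority order and returns the first quote
// that is both available and fresh, treating stale data the same as an outage.
func freshQuote(symbol string, providers ...Provider) (Quote, error) {
	for _, p := range providers {
		q, err := p.Latest(symbol)
		if err != nil {
			continue // provider outage: fail over to the next one
		}
		if time.Since(q.AsOf) > maxStaleness {
			continue // stale result breaches the SLO: also fail over
		}
		return q, nil
	}
	return Quote{}, errors.New("all providers failed or stale for " + symbol)
}

// stubProvider stands in for a real vendor client.
type stubProvider struct {
	name string
	q    Quote
	err  error
}

func (s stubProvider) Name() string                 { return s.name }
func (s stubProvider) Latest(string) (Quote, error) { return s.q, s.err }

func main() {
	primary := stubProvider{name: "vendor-a", err: errors.New("timeout")}
	backup := stubProvider{name: "vendor-b", q: Quote{Symbol: "BTC-USD", Price: "65000.10", AsOf: time.Now()}}
	if q, err := freshQuote("BTC-USD", primary, backup); err == nil {
		fmt.Printf("served %s at %s via failover\n", q.Symbol, q.Price)
	}
}
```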
Organizational Knowledge:
- Collaborate with Protocols, Trading, and Custody teams to understand their data needs and design certified data products with clear SLAs
- Partner with Data Platform team on orchestration, storage patterns (BigLake), and metadata management (Atlan)
Communication and Influence:
- Advocate for contract-governed data supply chains and help establish engineering standards for producer patterns across the org
- Contribute to architectural decisions and help mature the team's practices around observability, testing, and operational excellence
You may be a fit for this role if you have:
- 5-7+ years building streaming or high-throughput data systems: You have experience designing and operating production data pipelines that handle large volumes with low latency and high reliability
- Solid backend engineering skills: You're proficient in Go or Python and have built services that interact with streaming infrastructure (Kafka, pub/sub, WebSockets, REST APIs)
- Blockchain data familiarity: You understand blockchain concepts and are comfortable working with on-chain data (transactions, events, staking, validators) across multiple chains with different data models
- Data-engineering-adjacent skills: You're comfortable with data transformation patterns, schema evolution, and working with cloud data warehouses (BigQuery) and storage systems (GCS, BigLake)
- Operational mindset: You have experience deploying and operating services on cloud platforms (preferably GCP), with strong practices around monitoring, alerting, and incident response
Although not a requirement, bonus points if:
- Staking data expertise: You've worked with staking rewards, validator data, or proof-of-stake blockchain infrastructure
- Market data systems: You've built systems that ingest and process market data (prices, trades, order books) from exchanges or data vendors
- Infrastructure as code: You have experience with Terraform, Kubernetes, and modern DevOps practices
- You're the kind of person who gets excited about data freshness SLOs and celebrates when p95 latency drops by 50ms