Kollective Technology, Inc. is the global leader in end-to-end enterprise video delivery solutions for medium and large enterprises. With the Kollective solution, you can deliver high-quality live video, streamed video, and video file downloads to all of your global employees, regardless of network connectivity limitations and without deploying any networking or caching hardware.
This position is based at our headquarters office in Bend, Oregon. Remote negotiable.
Are you passionate about building data pipelines with cutting-edge big data technologies? The Kollective eCDN platform gathers data from millions of devices around the world, and we need your help to wrangle it! As a Data Engineer at Kollective, you will architect, and be part of the team that owns, the pipeline and tools that allow us to extract valuable insights from our data and provide those insights back to our customers through stunning, highly performant visualizations. You’ll work closely with engineering and product teams to optimize our data infrastructure for maximum scalability and security, enabling machine learning and BI use cases. Be prepared to work in a fast-paced, highly collaborative agile environment and help us deliver the next generation of features for the Kollective platform.
Responsibilities:
- Build and maintain a scalable, secure, and cost-effective big data pipeline architecture
- Build analytics tools for data mining, machine learning, and BI use cases
- Work with software engineers to optimize data ingestion and schema for processing
- Work with infrastructure engineers to build and automate scalable infrastructure and keep our data secure
- Work with product teams and business analysts to extract actionable insights from our data
- Actively seek out and implement ways to improve our underlying processes and technologies
Requirements:
- BS in Computer Science, Engineering, or an equivalent related field
- 3+ years of industry experience building and optimizing big data pipelines and datasets
- Advanced SQL knowledge and experience working with relational databases including PostgreSQL
- Strong working experience with Python, Java, or Scala
- An ability to work under pressure while maintaining a sense of humor
Nice to Haves:
- Experience with a variety of big data tools including Apache Spark and Apache Kafka
- Experience with Azure cloud services: Event Hubs, Data Lake Storage, Databricks, Cosmos DB
- Experience with stream-processing systems including Spark Streaming or similar products
- Experience with data lakehouse architectures and building ETL pipelines to create clean structured data models
- Strong analytic skills related to working with unstructured datasets
Please be prepared to provide 2-3 professional references as part of our application process.
HOW TO APPLY
Please send your resume and cover letter to email@example.com. No phone calls please.
The largest, most successful global companies trust Kollective Technology to power their Enterprise Live and On-Demand video delivery, serving millions of users worldwide. From its software-defined enterprise content delivery network (SD ECDN) to related IT tools like Network Readiness Testing and Network Analytics, Kollective drives a powerful ROI and makes the flexibility of software-defined networking a reality.
Do you have a desire to put your skills into practice in a small company serving over a hundred of the world’s largest corporations? Are you a big picture thinker who pays attention to the smallest of details? Are you a hands-on leader willing to go the extra mile and do what it takes to reach your goals? Join a rapidly growing company at the cutting edge of large-scale content delivery and make a real impact.