Anton Karamanov

Software Engineer



2015 – Now

Middle Software Developer

Yandex, Moscow

LogBroker has become the de facto standard for internal data transfer and currently spans hundreds of nodes across multiple data centers in different geographical regions.

As a middle developer I continued development of important LogBroker features, including:

  • integrated distributed quota and ACL support to handle the increasing number of producers and consumers migrating to the system;
  • developed automated parser and validator derivation that enables writing configuration declaratively as Scala data structures and case classes, using the shapeless library. The same auto-derived parsers and validators are reused, without any additional boilerplate, in multiple subsystems: server-side and client-side validation of dynamically applied settings (enforcing stronger correctness guarantees and notifying clients, usually operations, about incorrect settings), the HTTP API for cluster control, and pre-commit configuration tests. This dramatically simplified adding new settings and maintaining configuration, and prevented common errors during operations procedures; since the feature's release, no configuration mistake has led to a system failure caused by human error;
  • migrated client offset storage from ZooKeeper to the new Kafka API;
  • cleaned up existing internal tools and developed new ones to simplify debugging and operation of the system;
  • developed a tool for automated Kafka cluster restarts, which restarts the cluster node by node, verifying that the cluster has returned to a balanced state before proceeding to the next node;
  • participated in developing a tool that automatically checks and restores a Kafka cluster after failures causing data loss on some replicas, in cases where data truncation is not allowed;
  • refactored a large portion of the code, building an abstraction layer over Kafka to simplify migration between versions and make it possible to support an alternative backend for the system;
  • supported a new experimental storage backend, developed in-house by another team, that provides stronger guarantees when delivering important data such as financial information via LogBroker.
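The declarative-configuration idea from the list above can be sketched in plain Scala. All names here are illustrative, and the parse/validate pair is written out by hand for a single hypothetical setting; the actual system derived such parsers and validators generically for any case class via shapeless instead of hand-writing them:

```scala
// Hypothetical setting described as a plain case class.
case class RetentionConfig(retentionHours: Int, maxPartitions: Int)

object RetentionConfig {
  // Parse a flat key-value map into the case class, collecting errors
  // instead of failing on the first one.
  def parse(raw: Map[String, String]): Either[List[String], RetentionConfig] = {
    def int(key: String): Either[String, Int] =
      raw.get(key).toRight(s"missing key: $key").flatMap { v =>
        try Right(v.toInt)
        catch { case _: NumberFormatException => Left(s"$key is not an integer: $v") }
      }
    (int("retentionHours"), int("maxPartitions")) match {
      case (Right(h), Right(p)) => validate(RetentionConfig(h, p))
      case (a, b)               => Left(List(a, b).collect { case Left(e) => e })
    }
  }

  // The same validator can back pre-commit configuration tests, the HTTP
  // cluster-control API and runtime setting updates alike.
  def validate(c: RetentionConfig): Either[List[String], RetentionConfig] = {
    val errors = List(
      if (c.retentionHours <= 0) Some("retentionHours must be positive") else None,
      if (c.maxPartitions <= 0) Some("maxPartitions must be positive") else None
    ).flatten
    if (errors.isEmpty) Right(c) else Left(errors)
  }
}
```

Because parsing and validation live next to the data definition, every subsystem that consumes the setting gets the same correctness checks for free.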

2013 – 2015

Junior Software Developer

Yandex, Moscow

I worked on the LogBroker project, a large distributed system based on Apache Kafka. LogBroker serves as a centralized data bus that controls the company's internal data flow, handling hundreds of terabytes of data daily. It aggregates data from servers across the company, preprocesses it into a unified format, and delivers it to processing endpoints such as MapReduce clusters for batch processing or real-time log analytics systems, while guaranteeing no data loss or duplication (exactly-once delivery semantics).

LogBroker has been successfully used in production for over a year and is gradually replacing the old data flow infrastructure. 95% of data travels end-to-end, from the server of origin to the processing endpoint, in less than 3 seconds, with the goal of reaching the 99% mark.

During my involvement in the project I:

  • participated in migrating core components to the Akka framework;
  • worked on cleaning up the service API for coherence, consistency, and ease of use;
  • developed dynamic configuration support that allows changing properties at runtime, either by updating a file or by setting an overriding value in ZooKeeper, without restarting the service;
  • developed an infrastructure for functional testing based on Jenkins, Docker, and ScalaTest, which automatically runs the full test suite for each commit and build;
  • participated in developing unit and functional tests for the project.
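The runtime-overridable configuration described above can be sketched as follows. The names are hypothetical and the override source is abstracted away; in the real system the override lived in a ZooKeeper node watched by the service, while the base value came from the configuration file:

```scala
import java.util.concurrent.atomic.AtomicReference

// A minimal sketch of a dynamically reconfigurable property: it keeps a
// base value (from the configuration file) plus an optional override
// (in the real system, a watched ZooKeeper node). Readers always see the
// current effective value, so no service restart is needed.
final class DynamicProperty[A](initial: A) {
  private val base        = new AtomicReference[A](initial)
  private val overrideRef = new AtomicReference[Option[A]](None)

  // The value the service should actually use right now.
  def effective: A = overrideRef.get.getOrElse(base.get)

  // Invoked when the configuration file is re-read after an update.
  def reload(value: A): Unit = base.set(value)

  // Invoked from a ZooKeeper watch when the override node is set or removed.
  def setOverride(value: Option[A]): Unit = overrideRef.set(value)
}
```

Keeping the override separate from the base value means removing the ZooKeeper override cleanly falls back to whatever the configuration file currently says.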

In the course of development I gained experience with the Scala programming language, the Akka and Spray frameworks, and the Apache Kafka and ZooKeeper projects. I also learned a lot about building fault-tolerant systems and distributed service architecture.

2012 – 2013

Intern Software Developer

Yandex, Moscow

Worked on MapReduce jobs for ad hoc analytics written in Perl and Python.

Participated in the development of an online chart editor for an internal statistical-report publication platform. Developed a JavaScript library for client-side processing of report data before loading it into the chart rendering engine.


2010 – 2014

Bachelor of Information Technology

MSIREA, Moscow




  • Scala
  • Python
  • JavaScript


  • C/C++
  • Java
  • Clojure
  • Haskell



  • vim
  • git
  • sbt
  • Docker
  • Vagrant
  • Jenkins



  • Akka
  • Spray
  • Typesafe Config
  • json4s
  • slf4j/log4j
  • ScalaTest
  • ScalaCheck
  • Mockito