Welcome, {{ visitor.name }}, to my personal page!
A software developer for about 18 years, and counting.
I have mostly worked on projects that handle high volumes of data, often in real time, and then do something clever with it :-) I love writing smart software and am not afraid to dig into the science behind things to do it.
I have worked remotely for many years. I had to learn to be disciplined and organized, since my pay was directly tied to the quality and overall results of my work.
I also value my passion for a particular project much more highly than the pay.
Mealmind, my own project
When you're a software developer, you get paid for what you're good at right now. This creates a vicious cycle in which you accumulate more and more specialized knowledge, which makes your employer happy but makes it harder and harder for you to turn to anything new.
This is why in 2017 I left my last cushy job and decided to spend some time expanding my skills in new directions that interested me.
Mealmind became a testing ground where I could catch up with new IT technologies in multiple areas, including:
I improved my Rust, using it for data processing (tokio, SeaORM) as well as for the backend and frontend, eventually replacing many parts written in JavaScript and Python with Yew and Actix.
I did a lot of deep learning (neural networks) in general and had a chance to work with multiple neural network architectures:
In the process I had to do a lot of hyperparameter tuning, regularization, and optimization, learn how to structure such ML projects, and use pre-training to save on training effort.
I got used to reading, dissecting, and reproducing complex research papers (in fields such as reinforcement learning, NLP, and optimization theory).
I learned GPU-accelerated computing with OpenCL, including writing custom ML kernels.
I got experience deploying, managing, and orchestrating containers using Docker, Kubernetes, and Ansible.
Although not new to monitoring, this time I had to build it myself, using Eclipse Mosquitto, Grafana and Telegraf.
Sadly, due to the... let's say, global political and market changes, I had to abandon the project. Still, I'm grateful for the broad experience it gave me.
IPONWEB / BidSwitch
IPONWEB specialises in programmatic and real-time advertising technology and infrastructure. One of its divisions, BidSwitch, was created to help solve many of the underlying technical challenges and inefficiencies that hamper platform interconnectivity and trading at the infrastructure level.
Originally I started as a web frontend and backend developer using Python, Django, and Angular.
During my time there, I helped start several internal projects that later grew into new products or parts of them: most notably the BidSwitch UI, internal financial reporting, ad traffic forecasting, automatic creative approval, and several client-facing APIs.
I spent my last years at the company primarily focused on high-load backend projects related to processing and analyzing large amounts of data and communicating with other services. During this time I used C++ and Python, gained extensive experience with PostgreSQL and with non-relational Cassandra and MongoDB clusters, and worked with many critical parts of modern IT infrastructure, including automatic monitoring, testing, real-time error reporting, and continuous integration and delivery.
Designed and implemented an entire online marketplace of actors playing the role of Santa (Ded Moroz in Russia) and providing related services. Integrated the project with the QIWI online payment system. Technologies: Python, Django, JavaScript, HTML, CSS, PostgreSQL.
Smart Links is a marketing and advertising company. There I was responsible for
The system was written in C and relied on ZeroMQ for connectivity. It was capable of processing hundreds of documents (HTML pages) per second in Russian or Ukrainian on a typical home PC, normalizing the texts and identifying key parts that could be targeted with ads.
{% dated_content( title="Senior Software Developer (as a freelancer)", lead="Artela.ru (startup)", date="February 2010 - November 2010", tech="Python, Django, JavaScript, HTML, CSS, PostgreSQL", ) %} Was responsible for designing and implementing the architecture of a universal store for digital services such as VoIP telephony, web domains, and hosting.
This was my first commercial project where I could apply insights from Alan Cooper's book "About Face" and other similar publications. I got to practice writing user stories and designing the whole structure of the user interface.
Integrated the store with several online payment systems: WebMoney, Yandex.Money, and PayPal. {% end %}
{% dated_content( title="Senior Software Developer (as a contractor)", lead="Gzt.ru news agency", date="February 2009 - September 2009", tech="Python, Django, JavaScript, MySQL, PostgreSQL, Nginx, HTML, CSS, SVN, Trac", ) %} Designed and developed the backend and, partially, the frontend (a heavily customized Django admin CMS) of the portal.
Due to the high volume of traffic and the very dynamic nature of the content, the project required a carefully crafted data model, complex fine-tuned SQL queries, and various caches at different levels. {% end %}
Domik63.ru Real Estate Information Portal
Was responsible for the development of both the backend and the frontend parts of the portal, and then for its maintenance & support.
Integrated the portal's database with multiple real estate agencies by providing tools for regularly updating information through ordinary spreadsheets.
Technologies: Python, Django, PostgreSQL.
Inter-M, Web Design Studio
Backend Web developer. Technologies: Python, Django, PostgreSQL, Linux, Mercurial.
Unkom, Web Application Development Company
Originally came as a backend web developer, later transitioning to PHP and Java. Created a CMS system and a forum engine used by the company as a base for multiple websites for businesses and government organizations. Notably, created the backend for the main [MuzTV](https://muz-tv.ru) site and forum.
Main technologies: Perl, PHP, .NET, Java, and MySQL with PostgreSQL. OS: Linux (mostly) / Windows.
There were plenty over the years, so I listed only those for which I have formal documents.
4-year Diploma of Incomplete Higher Education
I love gardening. After months of studying how to grow plants the right way, I created a calculator that uses mathematical optimization (linear programming) to make sure my fertilizer mix always fits the recommendations of plant nutrition experts and contains everything necessary for healthy plant growth.
With this calculator, you can quickly create a fully balanced Dr. Jacob Mittleider's Weekly Feed mix using virtually any fertilizers locally available to you.
An extra benefit of this tool is that it can guide the gardener step by step through the process of creating the mix by pointing out problems with the current recipe.
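The core of the idea fits in a few lines. Here is a minimal sketch of such an optimization (not the calculator's actual code; the fertilizers, numbers, and costs below are made up purely for illustration), using scipy.optimize.linprog:

```python
# Find the cheapest fertilizer mix that covers the target amounts of
# nutrients. All numbers here are illustrative, not real recommendations.
from scipy.optimize import linprog

# Rows = nutrients (N, P, K); columns = fertilizers.
# Each entry: grams of nutrient per gram of fertilizer.
nutrient_content = [
    [0.34, 0.00, 0.13],  # N in: ammonium nitrate, superphosphate, NPK blend
    [0.00, 0.20, 0.13],  # P
    [0.00, 0.00, 0.13],  # K
]
targets = [10.0, 5.0, 8.0]          # grams of N, P, K the mix must provide
cost_per_gram = [0.02, 0.03, 0.05]  # price of each fertilizer

# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so the
# "at least the target" constraints are written with negated signs.
result = linprog(
    c=cost_per_gram,
    A_ub=[[-a for a in row] for row in nutrient_content],
    b_ub=[-t for t in targets],
    bounds=[(0, None)] * 3,         # amounts can't be negative
)
print(result.x)                     # grams of each fertilizer to weigh out
```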
Despite Markdown having become the de facto standard for markup in comments and some blogs, it has quite limited expressiveness when it comes to creating rich online publications.
Textile, by contrast, was created with CMSes and complex publications in mind. For instance, it makes it possible to create out-of-the-ordinary content blocks without resorting to HTML. I used it quite a lot in my own practice, and when the need arose I ported the PHP-Textile parser to Rust. The port is not beautiful, since I tried to preserve the original code structure as much as possible to simplify porting new features, but it's functional :)
While I was fiddling with CNNs and Transformers for text processing, I decided to try them on time series predictions, such as stock and cryptocurrency prices.
So I wrote a tool that could
This has not made me rich, but it helped me realize that WaveNet is good not only for natural speech processing :)
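For the curious, here is a minimal sketch of the kind of model involved (not the original tool; the window size, filter counts, and dilations below are arbitrary), written against tf.keras:

```python
# A WaveNet-style stack of causal dilated 1D convolutions that predicts
# the next value of a series from a window of its past values.
from tensorflow import keras
from tensorflow.keras import layers

def build_model(window: int = 128, filters: int = 32) -> keras.Model:
    inputs = keras.Input(shape=(window, 1))   # a window of past prices
    x = inputs
    for dilation in (1, 2, 4, 8, 16):         # receptive field grows exponentially
        x = layers.Conv1D(filters, kernel_size=2, dilation_rate=dilation,
                          padding="causal", activation="relu")(x)
    x = layers.Conv1D(1, kernel_size=1)(x)    # a forecast at every time step
    outputs = layers.Lambda(lambda t: t[:, -1, :])(x)  # keep only the last one
    model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model
```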
Keras-transformer is a library implementing the nuts and bolts for building (Universal) Transformer models using Keras. It allows you to assemble a multi-step Transformer model in a flexible way.
The library supports positional encoding and embeddings, attention masking, memory-compressed attention, and ACT (adaptive computation time). All pieces of the model (such as self-attention, the activation function, and layer normalization) are available as Keras layers, so, if necessary, you can build your own version of the Transformer by re-arranging or replacing some of them.
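To give a feel for what such a building block looks like, here is a generic illustration in plain tf.keras (the library's actual classes and names differ):

```python
# A bare-bones single-head scaled dot-product self-attention layer.
import tensorflow as tf
from tensorflow.keras import layers

class SelfAttention(layers.Layer):
    def __init__(self, depth: int, **kwargs):
        super().__init__(**kwargs)
        self.depth = depth
        self.to_q = layers.Dense(depth, use_bias=False)  # queries
        self.to_k = layers.Dense(depth, use_bias=False)  # keys
        self.to_v = layers.Dense(depth, use_bias=False)  # values

    def call(self, inputs):
        q, k, v = self.to_q(inputs), self.to_k(inputs), self.to_v(inputs)
        # Each position attends to every position of the same sequence.
        scores = tf.matmul(q, k, transpose_b=True) / self.depth ** 0.5
        return tf.matmul(tf.nn.softmax(scores, axis=-1), v)
```

Layers like this can then be combined with layer normalization and feed-forward blocks to assemble a full encoder step.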
For those who don't know, the (Universal) Transformer is a deep learning architecture described in arguably two of the most impressive DL papers of 2017 and 2018: "Attention Is All You Need" and "Universal Transformers" by the Google Brain team.
Their authors introduced the idea of (recurrent) multi-head self-attention, which has inspired a big wave of new research models that keep coming ever since, demonstrating new state-of-the-art results in many natural language processing tasks, including translation, parsing, question answering, and even algorithmic tasks.
Unlike classical recurrent neural networks, the Transformer trains much faster, measured both in time per epoch and in wall-clock time. It's also capable of efficiently handling multiple long-term dependencies in texts.
When applied to text generation, the Transformer creates more coherent stories, whose quality does not degrade as they grow longer, as is typically the case with recurrent networks.
KERL is a collection of various Reinforcement Learning algorithms and related techniques implemented purely using Keras.
The goal of the project is to create implementations of state-of-the-art RL algorithms, as well as a platform for developing and testing new ones, while keeping the code simple and portable thanks to Keras and its ability to use various backends. This makes KERL very similar to OpenAI Baselines, only with a focus on Keras.
What works in KERL:
All algorithms support Pop-Art adaptive normalization of returns, described in DeepMind's paper "Learning values across many orders of magnitude". This greatly simplifies training, often making it possible to just throw the algorithm at a task and get a decent result.
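The gist of Pop-Art, compressed into a toy scalar version (a sketch of the idea from the paper, not KERL's actual code):

```python
class PopArt:
    """Tracks the mean/std of value targets and rescales the linear output
    layer so that the network's unnormalized predictions stay unchanged."""

    def __init__(self, beta: float = 1e-3):
        self.beta = beta
        self.mu, self.nu = 0.0, 1.0  # running first and second moments of targets
        self.w, self.b = 1.0, 0.0    # scalar output-layer weight and bias

    def update(self, target: float) -> float:
        old_mu = self.mu
        old_sigma = max((self.nu - self.mu ** 2) ** 0.5, 1e-4)
        # ART: adaptively rescale targets via running statistics.
        self.mu += self.beta * (target - self.mu)
        self.nu += self.beta * (target ** 2 - self.nu)
        sigma = max((self.nu - self.mu ** 2) ** 0.5, 1e-4)
        # POP: preserve outputs precisely by rescaling the output layer.
        self.w *= old_sigma / sigma
        self.b = (old_sigma * self.b + old_mu - self.mu) / sigma
        # The value head is then trained on the normalized target.
        return (target - self.mu) / sigma
```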
With KERL you can quickly train various agents to play Atari games from pixels and dive into the details of their implementation. Here's an example of such an agent trained with KERL (YouTube video): Deep RL A2C Agent Playing Ms. Pacman
Limitations: Currently KERL does not support continuous control tasks, and so far it has been tested only on various Atari games supported by the Arcade Learning Environment via OpenAI Gym.
Avalanche is a simple deep learning framework written in C++ and Python. Unlike the majority of existing tools, it is based on OpenCL, an open computing standard. This allows Avalanche to work on pretty much any GPU, including ones made by Intel and AMD, even quite old models.
The project was created as an attempt to better understand how modern deep learning frameworks like TensorFlow do their job and to practice programming GPUs. Avalanche is based on a computational graph model.
It supports automatic differentiation, broadcasted operations, and automatic memory management, and it can utilize multiple GPUs if needed.
The framework also works as a backend for Keras, so if you know Keras, you can begin using Avalanche without having to learn anything new.
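The computational-graph idea at the heart of such frameworks can be illustrated in a few lines (a toy reverse-mode autodiff sketch, nothing like Avalanche's actual C++/OpenCL internals):

```python
# Each node remembers its parents and the local gradient toward each one,
# so calling backward() propagates gradients through the whole graph.
class Node:
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def backward(self, grad=1.0):
        self.grad += grad
        for parent, local_grad in self.parents:
            parent.backward(grad * local_grad)

def mul(a, b):
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def add(a, b):
    return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

x, y = Node(3.0), Node(4.0)
z = add(mul(x, y), x)    # z = x*y + x
z.backward()
print(x.grad, y.grad)    # dz/dx = y + 1 = 5.0, dz/dy = x = 3.0
```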
CartGP is a very simple and minimalistic C++/Python library implementing Cartesian Genetic Programming (CGP). The library currently supports the classic form of CGP, where nodes are arranged into a grid and no recurrent connections are allowed.
Check this Jupyter notebook to see how to use the library from Python.
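For readers unfamiliar with CGP, here is a toy illustration of how such a genome is decoded and evaluated (not CartGP's actual API; the encoding below is deliberately simplified):

```python
import operator

# The function set each node can choose from.
FUNCTIONS = [operator.add, operator.sub, operator.mul]

def evaluate(genome, inputs, n_nodes, output_gene):
    """genome: a flat list of (function, input, input) gene triples,
    one triple per grid node; connections may only point backwards."""
    values = list(inputs)  # slots 0..len(inputs)-1 hold the program inputs
    for i in range(n_nodes):
        f, a, b = genome[3 * i: 3 * i + 3]
        values.append(FUNCTIONS[f](values[a], values[b]))
    return values[output_gene]

# Two inputs x=3, y=4; node 2 computes x*y, node 3 computes (x*y)+x.
genome = [2, 0, 1,   # node 2: mul(slot 0, slot 1)
          0, 2, 0]   # node 3: add(slot 2, slot 0)
print(evaluate(genome, inputs=[3.0, 4.0], n_nodes=2, output_gene=3))  # 15.0
```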
Apart from being a software developer and AI enthusiast, I enjoy running, bicycling, modeling devices in FreeCAD, and printing them with my own DIY 3D printer.
I am also familiar with electronics at a level where I can design and build my own few-hundred-watt power supply entirely from scratch, or an MCU-based digital sensor.
Finally, I love plants and gardening with my family. I even have a blog entirely dedicated to this activity.