At Solution Seeker we perform real-time analyses of time-series data, using patented data processing algorithms, carefully designed and developed over time, and continuously improved by our brilliant researchers. Our relentless data scientists tailor applications to fit each customer's needs, providing insights and analyses that reveal the true value of their data. Their combined efforts yield new techniques for optimizing the performance of our neural networks.
Our portfolio spans a broad range: on one side, we receive data streams from thousands of industrial sensors; on the other, a rapidly evolving AI delivers value to our customers, all in real time. Our technology is the robust platform that makes this possible today, and it will continue to support our ambitions in the future.
Without the software and infrastructure to realize it, however, all this value and potential would remain caged in theorems and proofs of concept on paper, and in calculations run on year-old data on a computer in the basement of our Chief Research Officer, Bjarne Grimstad.
This is where the development team shines. Our runtime-optimized implementations of data processing and computation algorithms ensure that customers have real-time decision support. All our components communicate over Apache Kafka in a unique, event-driven architecture, resulting in a highly scalable and flexible system that aligns well with agile development. Combined with powerful container orchestration tools such as Docker Swarm and Kubernetes, infrastructure-as-code with Terraform and Ansible, and an in-house deployment file compiler, this lets us easily move our services between on-premise servers and public cloud providers.
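The core idea of the event-driven pattern can be sketched in a few lines of Python. This is only an illustration: the in-memory queue stands in for a Kafka topic, and the topic name, event schema, and handler are made up for the example. In production, each service would use a real Kafka producer or consumer, but the decoupling principle is the same.

```python
from queue import Queue

# In-memory stand-in for a Kafka topic (illustrative only).
sensor_events = Queue()

def produce(topic, sensor_id, value):
    """A producer publishes an event; it knows nothing about consumers."""
    topic.put({"sensor_id": sensor_id, "value": value})

def handle(event):
    """A downstream consumer reacts to each event independently,
    e.g. by recomputing a statistic (here just a toy transformation)."""
    return (event["sensor_id"], event["value"] * 2)

def consume(topic):
    """Drain the topic and process every pending event."""
    processed = []
    while not topic.empty():
        processed.append(handle(topic.get()))
    return processed

produce(sensor_events, "p-101", 21.5)
produce(sensor_events, "p-102", 19.0)
results = consume(sensor_events)
```

Because producers and consumers only share the topic, either side can be scaled, replaced, or redeployed without touching the other, which is what makes the architecture flexible.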
We monitor our system continuously with alarms and visualizations. Together, Prometheus and Grafana give us a wealth of metrics to supervise, enabling us to take preventive action and make adjustments that avoid downtime and long processing times.
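The alerting idea behind such monitoring can be sketched in plain Python: track a rolling window of a metric and fire an alarm when its average crosses a threshold. The class name, window size, and threshold below are invented for illustration; in practice this logic lives in Prometheus alerting rules, not application code.

```python
from collections import deque
from statistics import mean

class LatencyAlarm:
    """Toy sketch of a threshold alert: keep a rolling window of
    processing times and fire when the average gets too high."""

    def __init__(self, window=5, threshold_ms=200.0):
        # deque with maxlen automatically discards the oldest sample
        self.samples = deque(maxlen=window)
        self.threshold_ms = threshold_ms

    def observe(self, latency_ms):
        self.samples.append(latency_ms)

    def firing(self):
        return bool(self.samples) and mean(self.samples) > self.threshold_ms

alarm = LatencyAlarm(window=3, threshold_ms=100.0)
for latency in (80.0, 120.0, 150.0):
    alarm.observe(latency)
# rolling mean is (80 + 120 + 150) / 3 ≈ 116.7 ms, above the 100 ms threshold
```

Averaging over a window rather than alerting on single samples is what makes the alarm *preventive*: it flags a sustained trend before it becomes an outage.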
This is just part of what makes up our PCAI Platform, which encompasses over 400 billion individual data points and processes over 120,000 messages every minute, continuously recomputing statistics and delivering a secure and reliable system with over 99% uptime to customers all over the world.
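Continuously recomputing statistics over a stream at that volume requires incremental algorithms that never hold the full history in memory. A standard technique for this is Welford's online algorithm, shown below as a general sketch (not our actual implementation): mean and variance are updated one message at a time.

```python
class RunningStats:
    """Welford's online algorithm: update mean and variance
    incrementally per sample, without storing the stream."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        # uses the pre- and post-update deltas for numerical stability
        self.m2 += delta * (x - self.mean)

    def variance(self):
        """Population variance of the samples seen so far."""
        return self.m2 / self.n if self.n else 0.0

stats = RunningStats()
for value in (2.0, 4.0, 6.0):
    stats.update(value)
# mean is 4.0, population variance is 8/3
```

Each update is O(1) in time and memory, so the cost of keeping statistics current is independent of how many billions of points the platform has already seen.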
The back-end is primarily written in Python, with runtime-critical algorithms implemented in Rust. The front-end is mainly made up of TypeScript and React.
We pride ourselves on being efficient and flexible problem solvers, taking the eclectic selection of tasks that are thrown at us head-on. Our modular architecture lets us manage separate parts of the system relatively independently, allowing us to adhere to a fail-fast strategy.
Our tasks often revolve around accommodating new needs and requirements from both clients and our own data scientists, or designing creative solutions to unforeseen edge cases, all while maintaining generalizability. This offers a good balance between coding-intensive tasks and tasks that require stepping back for a more abstract thought process.
We also understand that our brains are our biggest assets, and we host weekly knowledge-sharing sessions to expand our collective skills. In these sessions, we take on a predetermined theme but allow our minds to wander, often letting tangents and questions lead us into unknown territories that we explore together. Working with anything from the Linux kernel to React components and machine learning algorithms, there's an abundance of topics to immerse yourself in.