Texas-based paññã is an AI-powered digital recruitment platform that scientifically identifies the best talent for a given skill set in the shortest time while avoiding human bias in the recruitment process.
paññã also enables the statistical evaluation of all job applicants and presents the top qualified candidates to hiring managers for in-person interviews or paññã Live interview sessions. It is a one-stop platform, combining applicant tracking and video interviewing so that recruiters can do their jobs easily and effectively.
While planning and developing paññã, the AI-powered digital interview platform, the mroads technology team envisioned a fully cloud-based system with no on-premises infrastructure. The service needed to be highly reliable and work seamlessly.
At paññã, video interviews are offered to clients on a pay-per-usage model. We explored services that would keep the platform available and autoscale according to predefined conditions. We also needed a cost-effective way to scale up and balance traffic across instances without overworking our engineering team. The goal was to scale down to minimal, near-zero cost when the services are not in use and scale up instantly when hundreds of interviews are taking place simultaneously, using auto scaling and serverless (AWS Lambda) architecture.
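A scale-with-demand setup of this kind is commonly expressed as a target-tracking policy on an EC2 Auto Scaling group. The sketch below builds the parameter dict for boto3's `put_scaling_policy`; the group name and CPU target are illustrative assumptions, not paññã's actual configuration:

```python
def target_tracking_policy(asg_name, target_cpu=60.0):
    """Build parameters for an EC2 Auto Scaling target-tracking policy.

    The policy keeps average CPU near `target_cpu` percent: the group adds
    instances under interview-day load and removes them as traffic drops
    back toward zero, which is what keeps idle-time cost minimal.
    """
    return {
        "AutoScalingGroupName": asg_name,
        "PolicyName": f"{asg_name}-cpu-target",
        "PolicyType": "TargetTrackingScaling",
        "TargetTrackingConfiguration": {
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization",
            },
            "TargetValue": target_cpu,
        },
    }

# The dict is ready to pass to boto3, e.g.:
#   boto3.client("autoscaling").put_scaling_policy(**policy)
policy = target_tracking_policy("interview-workers", target_cpu=55.0)
```

Target tracking was chosen for the sketch because it needs no separate scale-up and scale-down rules: one metric target covers both directions.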
Because paññã is a video interview platform, our engineers had to ensure it could manage high-volume video processing, with recordings accessible across multiple devices and browsers and transcoded on demand in real time. Transcoding had to cover all popular formats and be easy, scalable, and, most importantly, cost-effective. The platform also had to store recorded videos, play them in-app across multiple devices, and autoscale with spikes in usage. Amazon S3 stores the video recordings, and FFmpeg converts them between codecs.
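An on-demand FFmpeg conversion step like the one described can be sketched as a small command builder. The codec choices and scaling filter are generic FFmpeg usage, not paññã's actual pipeline; filenames are placeholders:

```python
import shlex

def transcode_command(src, dst, codec="h264", height=720):
    """Build an ffmpeg command line that transcodes `src` into `dst`.

    scale=-2:<height> resizes to the target height while keeping the
    aspect ratio and an even width, which H.264 requires. The audio
    codec is paired to the container convention (AAC for MP4/H.264,
    Opus for WebM/VP9).
    """
    vcodec = {"h264": "libx264", "vp9": "libvpx-vp9"}[codec]
    acodec = "aac" if codec == "h264" else "libopus"
    return [
        "ffmpeg", "-y", "-i", src,
        "-vf", f"scale=-2:{height}",
        "-c:v", vcodec,
        "-c:a", acodec,
        dst,
    ]

cmd = transcode_command("interview.webm", "interview.mp4", codec="h264")
print(shlex.join(cmd))  # the argv list could be run with subprocess.run(cmd)
```

Emitting the command as an argv list rather than a shell string avoids quoting bugs when filenames contain spaces.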
mroads chose a suite of AWS services for building paññã. The main pillar of the architecture is Amazon Elastic Compute Cloud (Amazon EC2), which provides secure, resizable compute capacity in the cloud with a service-level agreement of 99.99 percent availability per region, while Amazon Simple Storage Service (Amazon S3) stores the platform's high volume of data. No matter how many interviews take place simultaneously, the services scale automatically so users do not experience network latency. These AWS services let paññã scale up easily when traffic is high and scale down when it is low, which saves money.
The platform also consists of several microservices running on Amazon Elastic Compute Cloud (Amazon EC2) instances, each with its own purpose, such as on-demand transcoding and processing of candidates' interview videos. The instances process huge quantities of data coming from the apps, which is stored in Amazon Simple Storage Service (Amazon S3). As soon as a candidate finishes an interview, the report is published online for the interviewer to evaluate.
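Storing high volumes of recordings in S3 usually starts with a deterministic key layout. The helper below is a hypothetical naming scheme, not paññã's actual one, showing one way to partition objects by client and date:

```python
from datetime import datetime, timezone

def recording_key(client_id, interview_id, fmt="mp4"):
    """Derive an S3 object key for an interview recording.

    Partitioning by client and UTC date keeps per-client listings cheap
    and lets S3 lifecycle rules (e.g. archive recordings after 90 days)
    be scoped to a simple prefix.
    """
    day = datetime.now(timezone.utc).strftime("%Y/%m/%d")
    return f"recordings/{client_id}/{day}/{interview_id}.{fmt}"

# Upload would then be a standard boto3 call, e.g.:
#   boto3.client("s3").upload_file("local.mp4", bucket, recording_key("acme", "ivw-42"))
key = recording_key("acme", "ivw-42")
```

Keeping the key derivable from the interview record means the transcoding and report-publishing microservices can locate a recording without a shared lookup table.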
Amazon API Gateway, a fully managed and secure API service, allows the team to iterate quickly and sustain growth, and Amazon Simple Email Service (Amazon SES) automates the distribution of email notifications. To monitor system activity and ensure a smooth user experience, paññã uses Amazon CloudWatch for operational visibility. In addition, mroads takes advantage of a dozen more AWS services, including AWS Auto Scaling, Amazon Route 53, and AWS CloudFormation.
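An automated SES notification of the kind mentioned can be sketched as a builder for the `send_email` parameters. The sender address, subject, and wording are placeholders assumed for illustration:

```python
def report_ready_email(recipient, candidate_name, report_url):
    """Build the argument dict for SES `send_email`, notifying a hiring
    manager that a candidate's interview report is ready to review.

    Returning plain parameters keeps the message easy to unit-test
    without touching the network; sending is a one-line boto3 call.
    """
    return {
        "Source": "no-reply@example.com",  # placeholder verified sender
        "Destination": {"ToAddresses": [recipient]},
        "Message": {
            "Subject": {"Data": f"Interview report ready: {candidate_name}"},
            "Body": {
                "Text": {"Data": f"The interview report is available at {report_url}"},
            },
        },
    }

# Sending would then be:
#   boto3.client("ses").send_email(**msg)
msg = report_ready_email("manager@example.com", "J. Doe", "https://example.com/reports/1")
```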