T-Mobile uses R for Customer Service AI
T-Mobile, the global telecommunications company, is using R in production to automatically classify text messages sent to customer service and route them to an agent who can help. The AI @ T-Mobile team used the keras package for R to build a natural language processing engine on TensorFlow, and deployed it to production as a Docker container. The MRAN Time Machine pins the container to fixed R package versions for reproducibility.
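As a rough sketch (the snapshot date and package list below are placeholders, not T-Mobile's actual setup), pointing R at an MRAN snapshot inside the container build is enough to freeze every package install to the versions available on that date:

```r
# Illustrative only: pin package installs to a fixed MRAN snapshot so every
# container build resolves the same package versions. The date is a placeholder.
options(repos = c(CRAN = "https://mran.microsoft.com/snapshot/2018-11-01"))

# Packages installed after setting the snapshot repo come from that frozen date.
install.packages(c("keras", "tensorflow"))
```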
While you may think of Python first for building production AI models, the keras package in R has already proven to be a first-class interface for building them. The T-Mobile team built this system in just four months and on a small budget because they could explore in R and immediately deploy in R:
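To give a flavour of what building such a model in R looks like, here is a minimal, hypothetical text classifier with keras; the vocabulary size, layer sizes, and binary routing label are illustrative assumptions, not details from the T-Mobile post:

```r
library(keras)

# A minimal sketch of a keras text classifier in R. All sizes are placeholders.
num_words <- 10000   # vocabulary size (assumed)
max_len   <- 100     # padded message length in tokens (assumed)

model <- keras_model_sequential() %>%
  layer_embedding(input_dim = num_words, output_dim = 64, input_length = max_len) %>%
  layer_global_average_pooling_1d() %>%
  layer_dense(units = 32, activation = "relu") %>%
  layer_dense(units = 1, activation = "sigmoid")  # e.g. "needs a human agent" yes/no

model %>% compile(
  optimizer = "adam",
  loss = "binary_crossentropy",
  metrics = "accuracy"
)

# x_train: integer-encoded, padded messages; y_train: 0/1 routing labels (placeholders)
# model %>% fit(x_train, y_train, epochs = 5, batch_size = 32, validation_split = 0.2)
```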
“Despite being an incredibly popular language for exploratory analysis, data scientists are repeatedly told that R is not sufficient for machine learning – especially if those ML models are destined for production. Though much exploratory analysis and modeling is done in R, these models must be rebuilt in Python to meet DevOps and enterprise requirements. Our team doesn’t see the value in this double work. We explore in R and immediately deploy in R. Our APIs are neural network models using R and TensorFlow in docker containers that are small and maintainable enough to make our DevOps team happy!”
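The post doesn't say which R web framework serves those APIs; one common approach, sketched below purely for illustration, is to wrap the model in a plumber endpoint inside the Docker image (the saved model path and preprocessing helper here are hypothetical):

```r
# plumber.R -- a hedged sketch of exposing the model as an HTTP endpoint.
# plumber is an assumption for illustration, not necessarily T-Mobile's choice.
library(keras)
library(plumber)

# model <- load_model_hdf5("message_classifier.h5")  # hypothetical saved model

#* Classify an incoming customer message
#* @param text The message text to classify
#* @post /classify
function(text) {
  # preprocess_text() stands in for whatever tokenization/padding the model expects
  # x <- preprocess_text(text)
  # score <- as.numeric(predict(model, x))
  score <- runif(1)  # placeholder score so the sketch runs without a trained model
  list(needs_agent = score > 0.5, score = score)
}
```

The container's entry point can then be as simple as `plumber::plumb("plumber.R")$run(host = "0.0.0.0", port = 8000)`.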
If you'd like to try this yourself, the T-Mobile team has published an open-source version of their message-classification container on GitHub, and you can read the details in the blog post by the team leads at the link below.
T-Mobile blog: Enterprise Web Services with Neural Networks Using R and TensorFlow