How to achieve efficient data management in Managed RStudio


In a recent webinar, we provided an overview of our Managed RStudio platform and demonstrated how modern technology platforms like RStudio give you the ability to collect, store and analyse data at the right time, and in the right format, to make informed business decisions.

The Public Health Evidence & Intelligence team at Herts County Council demonstrated how they have benefitted significantly from Managed RStudio – enabling collaborative development, empowerment and productivity at a time when they needed it most. In turn, they have been able to scale their department.

Many of the questions from the webinar focused on the governance and security aspects of Managed RStudio. In this blog, we’ve gathered your questions and, for further clarity, attached a document that can help with any further questions regarding architecture, data management and maintenance.

Many of the questions asked related to how data is managed in the platform – from working with data on local drives and through the user interface, to the management of large datasets.

There are several methods of getting data in and out of Managed RStudio. The most appropriate method largely depends on the type and size of the data involved.

For data science teams to work productively and deliver effective results for the business, the starting point is the data itself. Accuracy, relevance, completeness, timeliness and consistency are the key criteria against which data quality needs to be measured. Good data quality needs disciplined data governance, thorough management of incoming data, accurate requirements gathering, strict regression testing for change management and careful design of data pipelines. This is over and above data quality control programmes for data delivered from outside and generated within.

Can you please elaborate on getting data into and out of the Managed RStudio platform?

Working with small data sets (< 100MB)

For smaller data sets, we recommend using RStudio Workbench’s upload feature directly from the IDE. Simply click ‘Upload’ in the ‘Files’ pane, then select any type of file from either your local hard disk or a mapped network drive. The file will be uploaded to the current directory. You can also upload compressed (zip) files, which are automatically decompressed on completion, which means you can bring in considerably more than the 100MB guideline.
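Once an archive has been uploaded and decompressed, the extracted files can be read straight from the project directory. Here is a minimal sketch – the file and directory names are purely illustrative:

```r
# After uploading 'survey_data.zip' through the Files pane, Workbench
# decompresses it into the current directory, so the extracted files can
# be read as normal. File names here are hypothetical.
survey <- read.csv("survey_data/responses.csv")
str(survey)
```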

Working with large data sets (> 100MB)

For larger data sets or real-time data, we recommend using an external service such as CloudSQL or BigQuery (GCP), Azure SQL Database or Amazon RDS. These can be interfaced directly using R packages such as bigrquery, RMariaDB or RMySQL.
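As an illustration, a hosted database can be queried through DBI without the full dataset ever touching local storage. The sketch below assumes a MySQL-compatible instance; the host, database name and credentials are placeholders, and a BigQuery connection via bigrquery follows the same DBI pattern:

```r
library(DBI)

# Connect to a MySQL-compatible instance (e.g. CloudSQL or Amazon RDS).
# Host, database name and credentials are placeholders; in practice they
# would come from environment variables or a managed credential store.
con <- dbConnect(
  RMariaDB::MariaDB(),
  host     = "db.example.internal",
  dbname   = "analytics",
  user     = Sys.getenv("DB_USER"),
  password = Sys.getenv("DB_PASSWORD")
)

# Query only the rows needed rather than pulling the whole table
recent_orders <- dbGetQuery(
  con,
  "SELECT order_id, order_date, total FROM orders WHERE order_date >= '2021-01-01'"
)

dbDisconnect(con)
```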

For consuming real-time data, we recommend using either Cloud Pub/Sub or Azure Service Bus to create a messaging queue from which R or Python can read messages.
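As a hedged sketch of the Pub/Sub side, messages can be pulled from a subscription via its REST API. The project and subscription names below are placeholders, and the access token is assumed to have been issued to a service account or via gcloud credentials; wrapper packages can be used instead if preferred:

```r
library(httr)
library(jsonlite)

project      <- "my-gcp-project"    # placeholder
subscription <- "incoming-events"   # placeholder
token        <- Sys.getenv("GCP_ACCESS_TOKEN")  # assumed to be pre-issued

pull_url <- sprintf(
  "https://pubsub.googleapis.com/v1/projects/%s/subscriptions/%s:pull",
  project, subscription
)

# Pull up to 10 messages from the subscription
resp <- POST(
  pull_url,
  add_headers(Authorization = paste("Bearer", token)),
  body = list(maxMessages = 10),
  encode = "json"
)

msgs <- content(resp, as = "parsed")$receivedMessages

# Message payloads arrive base64-encoded
payloads <- vapply(
  msgs,
  function(m) rawToChar(base64_dec(m$message$data)),
  character(1)
)
```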

Sharing data between RStudio Pro/Workbench, Connect and other users

Data can easily be shared via ‘pins’, allowing data to be published to Connect and shared with other users, across Shiny apps and RStudio sessions.
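A minimal sketch with the pins package is shown below. The board registration and pin names are illustrative, and authentication to Connect is assumed to be configured in the usual way (for example via the CONNECT_SERVER and CONNECT_API_KEY environment variables):

```r
library(pins)

# Register RStudio Connect as a pin board (older pins releases use
# board_rsconnect() instead of board_connect()).
board <- board_connect()

# Publish a data frame so other users, Shiny apps and scheduled reports
# can consume it; the pin name is a placeholder.
pin_write(board, mtcars, name = "example_dataset")

# Consumers read the shared data back by name
shared <- pin_read(board, "your.username/example_dataset")
```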

Getting data out of Managed RStudio

As with uploads, there are several methods to export data from Managed RStudio. RStudio Connect allows the publishing of Shiny apps, Flask and Dash applications, and R Markdown documents, and it also allows the scheduling of e-mailed reports. For one-off analytics jobs, RStudio also allows you to download files directly from the IDE.
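For example, a Shiny app or R Markdown report can be pushed to Connect from a Workbench session with rsconnect. The directory, app name and server below are placeholders, and the server is assumed to have already been registered via the IDE or rsconnect::addServer():

```r
library(rsconnect)

# Publish a Shiny application (or an R Markdown report) to RStudio Connect.
# Paths and names are placeholders.
deployApp(
  appDir  = "~/projects/sales-dashboard",
  appName = "sales-dashboard",
  server  = "connect.example.com"
)
```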

The managed service also allows uploading to any cloud service, such as cloud storage buckets.
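As a sketch, assuming a GCP environment and a service-account key, a results file can be written to a Cloud Storage bucket with googleCloudStorageR; the key path, bucket and object names are placeholders:

```r
library(googleCloudStorageR)

# Authenticate with a service-account key, then upload a results file.
# The key path, bucket and object names are placeholders.
gcs_auth("service-account-key.json")

gcs_upload(
  file   = "output/model_results.csv",
  bucket = "my-analytics-bucket",
  name   = "model_results/2021-06.csv"
)
```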

Package Management

R packages are managed and maintained by RStudio Package Manager, giving users complete control over which package versions are installed.

RStudio Package Manager also allows users to ‘snapshot’ a particular set of packages on a specific date to ensure consistency across projects and sessions.
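In practice, this means pointing R’s repository option at a dated snapshot URL served by Package Manager so that everyone resolves the same package versions. The hostname and date below are placeholders for your own Package Manager instance:

```r
# Pin the repository to a frozen Package Manager snapshot so every session
# installs identical package versions. The hostname and date are placeholders.
options(repos = c(
  CRAN = "https://packagemanager.example.com/cran/2021-06-01"
))

install.packages("dplyr")  # resolves against the 2021-06-01 snapshot
```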

The solution to disciplined data governance

As outlined above, good data quality needs disciplined data governance: thorough management of incoming data, accurate requirements gathering, strict regression testing for change management and careful design of data pipelines. This not only leads to better decisions based on data analysis, but also ensures compliance with regulation.

As a Product Manager at Mango, Matt is passionate about data and delivering products where data is key to driving insights and decisions. With over 20 years’ experience in data consulting and product delivery, Matt has worked across a variety of industries, including retail, financial services and gaming, to help companies use data and analytical platforms to drive growth and increase value.

Matt is a strong believer that the combined value of data and analytics is the key to the success of data solutions.
