Analysing comments to “Star Wars: The Last Jedi” – part 2
As already mentioned in my first post, I also analysed the user comments on a post at www.starwars-union.de word by word. The figure shows the word cloud generated from all comments (1728 so far).
To create such a wordcloud, I used the following code. The first part was already explained in my first post.
First, I load all necessary packages and build the URLs of all available comment pages.
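The original code is not shown here, so the following is a minimal sketch of this step. The base URL, the page count, and the `seite` paging parameter are assumptions on my part and would need to be adapted to the actual comment pages.

```r
# Packages for scraping, text mining, and plotting the wordcloud
library(rvest)         # HTML scraping
library(tm)            # corpus handling and text cleaning
library(wordcloud)     # wordcloud plotting
library(RColorBrewer)  # colour palettes

# Hypothetical base URL and paging scheme -- adapt to the real comment pages
base_url <- "https://www.starwars-union.de/nachrichten/..."  # placeholder
pages    <- 1:35                                             # assumed number of comment pages
urls     <- paste0(base_url, "&seite=", pages)               # assumed paging parameter
```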
This is the main part for scraping all the comments: I search each HTML page for id="kommentargesamt" and extract the comments. They are saved in the variable comments.
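As a hedged sketch of that step, here is how the extraction could look with rvest; the original post's exact scraping code may differ, but the idea is the same: pull the node with id="kommentargesamt" from each page and keep its text.

```r
comments <- character(0)

for (url in urls) {
  page <- read_html(url)
  # select the element with id="kommentargesamt" and extract its text
  txt <- page %>%
    html_nodes("#kommentargesamt") %>%
    html_text(trim = TRUE)
  comments <- c(comments, txt)
}
```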
Now everything is prepared for creating the wordcloud. For that purpose I used the following snippet, which I once found on the internet. There are many examples of creating a wordcloud with R, and I decided to use this one:
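Since the snippet itself is not reproduced here, the block below shows a common tm + wordcloud recipe of the kind referred to: build a corpus from the comments, clean it (lower case, no punctuation, no numbers, German stop words removed), count term frequencies, and plot. The thresholds and palette are illustrative choices, not necessarily the original ones.

```r
# Build and clean the corpus
corpus <- Corpus(VectorSource(comments))
corpus <- tm_map(corpus, content_transformer(tolower))
corpus <- tm_map(corpus, removePunctuation)
corpus <- tm_map(corpus, removeNumbers)
corpus <- tm_map(corpus, removeWords, stopwords("german"))
corpus <- tm_map(corpus, stripWhitespace)

# Term frequencies, sorted from most to least frequent
tdm  <- TermDocumentMatrix(corpus)
freq <- sort(rowSums(as.matrix(tdm)), decreasing = TRUE)

# Plot the wordcloud
set.seed(42)
wordcloud(names(freq), freq,
          min.freq = 10, max.words = 200,
          random.order = FALSE,
          colors = brewer.pal(8, "Dark2"))
```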
To get a nice graphical output, I recommend saving the wordcloud directly to a file rather than exporting it from the RStudio viewer or similar.
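One way to do that is to open a graphics device before plotting, for example the png device; the file name and dimensions below are just an example.

```r
# Write the wordcloud straight to a PNG file
png("wordcloud.png", width = 1200, height = 1200, res = 150)
wordcloud(names(freq), freq,
          min.freq = 10, max.words = 200,
          random.order = FALSE,
          colors = brewer.pal(8, "Dark2"))
dev.off()
```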
And that’s it! I think most of the words are comprehensible even for non-German readers 😉