Shiny meets Pavlovia Part Two

Using a Shiny-App to monitor your Pavlovia-Project

Luke Bölling (https://psychfactors.org), FernUniversität Hagen, Department of General Psychology (https://www.fernuni-hagen.de/psychologie-urteilen-entscheiden/team/luke.boelling.shtml)
08-10-2021

Intro

Welcome to part two of this “Shiny meets Pavlovia” series. After using an API-based approach to download data directly from the Pavlovia GitLab repository, we can now put this connection to use by incorporating it into a hosted Shiny app. At the end we have an experiment dashboard that gives us basic information about the progress of the experiment and, for example, lets us download pre-processed experiment data.

This tutorial isn’t an introduction to Shiny and its capabilities. You can get more information about Shiny in general here.

Let’s dive in.

The Shiny-App

Tl;dr: Here is a live demo of the whole app hosted on shinyapps.io: Live-Demo (PW: example)

Our goal is a Shiny app that allows us to:

- protect access to the data with a simple password,
- get an overview of all data files currently stored in the Pavlovia repository,
- look at a quick summary plot of the data collected so far, and
- download all session files merged into one pre-processed CSV file.

I have used this kind of Shiny dashboard to give students access to their current data in a pre-processed form (merging and wrangling can be difficult with e.g. SPSS).

All you need are about 150 lines of code, and everything inside datasetInput <- reactiveVal({...}) you have already seen in my first blog post.

Here is the app:

library(shiny)
library(stringr)
library(tidyverse)
library(httr)
library(jsonlite)
library(ggplot2)
library(shinyjs)
library(shinydashboard)
library(shinycssloaders)


# library(rsconnect)
# deployApp()


ui <- dashboardPage(
  # Application title
  dashboardHeader(title = "pavloviaShinyApp"),
  dashboardSidebar(
    column(12,
      align = "center", offset = 0,
      tags$style(".skin-blue .sidebar a { color: #444; }"),
      textInput("password", label = h3("Password"), placeholder = "Enter Password to get access to Data..."),

      # Button
      conditionalPanel(
        condition = "input.password == 'example'",
        downloadButton("downloadData", "Download", ) %>% withSpinner(color = "#0dc5c1")
      )
    )
  ),


  # Main panel for displaying outputs ----
  dashboardBody(
    useShinyjs(),
    fluidRow(conditionalPanel(
      condition = "input.password == 'example'",
      DT::dataTableOutput("dataOverview")
    )),
    fluidRow(conditionalPanel(
      condition = "input.password == 'example'",
      plotOutput("plotTest")
    ))
  )
)


# Define server logic
server <- function(input, output) {
  datasetInput <- reactiveVal({
    disable("downloadData")
    token <- read_file("token") # Personal Access Token for the Project
    project_id <- 149 # Project ID
    gitlabPavloviaURL <- paste0("https://gitlab.pavlovia.org/api/v4/projects/", project_id, "/repository/archive.zip") # API - URL to download whole repository
    r <- GET(gitlabPavloviaURL, add_headers("PRIVATE-TOKEN" = token)) # Getting Archive

    bin <- content(r, "raw") # Writing Binary

    temp <- tempfile() # Init Tempfile

    writeBin(bin, temp) # Write Binary of Archive to Tempfile

    listofFiles <- unzip(
      zipfile = temp, overwrite = T,
      junkpaths = T, list = T
    ) # Unzip only list of all files in the archive.zip file

    csvFiles <- grep("*.csv", x = listofFiles$Name, value = T) # Grep only the csv Files (Pattern can be extended to get only data-csv file)

    unzip(
      zipfile = temp, overwrite = T,
      junkpaths = T, files = csvFiles[1:100], exdir = "temp"
    ) # Unzip the csv Files in the temp-file

    csvFilesPaths <- list.files("temp/", full.names = T) # Get the unzipped csv-Files in the temp-directory

    # To keep only valid CSV files and to allow filtering by the file's date-time, we parse the standard date-time string in the default Pavlovia file names
    dateTimeOfFiles <- tibble(filepaths = csvFilesPaths) %>%
      mutate(dateTime = str_extract(filepaths, "[0-9]{4}-[0-9]{2}-[0-9]{2}_[0-9]{2}h[0-9]{2}")) %>%
      filter(!is.na(dateTime)) %>%
      mutate(dateTime = parse_datetime(dateTime, "%Y-%m-%d_%Hh%M"))
    # %>%  filter(dateTime > parse_datetime("2019-02-01_15h00", "%Y-%m-%d_%Hh%M")) # This can be used to Filter by a specific time

    # Purrr Magic  - Thanks to https://clauswilke.com/blog/2016/06/13/reading-and-combining-many-tidy-data-files-in-r/

    # Now read the desired data files with purrr:
    data <- tibble(filename = dateTimeOfFiles$filepaths) %>% # create a data frame
      # holding the file names
      mutate(
        file_contents = map(
          filename, # read files into
          ~ read_csv(file.path(.))
        ) # a new data column
      )

    # Unlink temp because we don't need it anymore
    unlink("temp", recursive = T)
    disable("downloadData")
    data
  })



  output$plotTest <- renderPlot({
    ggplot(dataMerged(), aes(y = resp.rt, x = congruent, color = factor(participant))) +
      stat_summary(geom = "point", fun = "mean") +
      stat_summary(geom = "errorbar", fun.data = mean_se)
  })

  # Table of selected dataset ----
  output$dataOverview <- DT::renderDataTable({
    DT::datatable(datasetInput() %>%
      rowwise() %>%
      mutate(
        participant = list(file_contents$participant[1]),
        fileDim = paste0("Rows: ", nrow(file_contents), " Vars: ", ncol(file_contents))
      ) %>%
      select(-file_contents),
    options = list(scrollX = TRUE)
    )
  })

  observeEvent(input$password, {
    if (input$password != "example") hide("dataOverview") else show("dataOverview")
  })
  observeEvent(input$password, {
    req(datasetInput())
    if (input$password != "example") disable("downloadData") else enable("downloadData")
  })

  dataMerged <- reactive({
    # Read in all available data in a single tibble
    datasetInput() %>% select(file_contents) %>% # remove filenames, not needed anynmore
      unnest(cols = c(file_contents))
  })

  output$downloadData <- downloadHandler(
    filename = function() {
      paste("Test_Data", ".csv", sep = "")
    },
    content = function(file) {
      write.csv(dataMerged(), file, row.names = FALSE)
    }
  )
}

# Run the application
shinyApp(ui = ui, server = server)

You can get the whole app by forking the GitHub repository: Github-Repository.

I encourage you to try out the demo code by yourself.

ATTENTION: You have to place a file named “token” containing your own access token from gitlab.pavlovia.org into the app directory. See my previous blog post for more information.
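If you are unsure what that file should look like, here is a minimal sketch (the token string is a placeholder, replace it with your own Personal Access Token). It writes the token without a trailing newline, so that read_file("token") in the app returns exactly the token string, and then starts the app locally:

library(readr) # read_file() is what the app uses to load the token

# Placeholder value - use your own Personal Access Token from gitlab.pavlovia.org
cat("YOUR-PERSONAL-ACCESS-TOKEN", file = "token")

# Quick check: the returned string should not contain a trailing newline
read_file("token")

# Run the dashboard locally before publishing it
shiny::runApp()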

Here is a live demo of the whole app hosted on shinyapps.io:

Live-Demo (PW: example)

Blog post 3 will dive into the data-processing and interactive options you will get by using this Shiny-App.

Get the demo running

For this post, my goal is to enable you to get this example app running on your own shinyapps.io account.

Clone the Repository

Open app.R File

Use RStudio and any R version around 4.x to publish the Shiny app. Once you have opened the app.R file, you are ready to publish the example app to your account.

Publish on shinyapps.io

Just use the Publish button to get the app to shinyapps.io.

You will need to create a shinyapps.io account and generate an access token. RStudio provides a very detailed explanation of the process.
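If you prefer deploying from the R console instead of the Publish button, a minimal sketch with rsconnect looks like this (account name, token, and secret are placeholders you copy from your shinyapps.io dashboard; the app name is just an example):

library(rsconnect)

# Placeholder credentials - copy your own from your shinyapps.io dashboard
setAccountInfo(
  name = "your-account-name",
  token = "YOUR-TOKEN",
  secret = "YOUR-SECRET"
)

# Deploy the app in the current working directory (app.R plus the "token" file)
deployApp(appName = "pavloviaShinyApp")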

At the end, make sure your “token” file is correctly prepared and uploaded together with the app.

You’re done! You have created a simple dashboard for the Stroop demo with a simple chart and in-depth information about the data files. (Note: With ...junkpaths = T, files = csvFiles[1:100], exdir = "temp"... I only unpack the first 100 CSV files because this repo is huge.)
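If you want to change that limit, or only process sessions recorded after a certain date, here is a small sketch of the two adjustments inside datasetInput() in app.R (temp, csvFiles, and dateTimeOfFiles are the objects already defined there; the date is just an example value):

# Unpack at most 100 CSV files, but avoid NA entries when fewer files exist
unzip(
  zipfile = temp, overwrite = T,
  junkpaths = T, files = head(csvFiles, 100), exdir = "temp"
)

# Keep only sessions recorded after a given date-time,
# i.e. re-enable the commented filter from the app
dateTimeOfFiles <- dateTimeOfFiles %>%
  filter(dateTime > parse_datetime("2021-06-01_00h00", "%Y-%m-%d_%Hh%M"))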

Please feel free to contact me if you have any questions.

For Part 3 I am still looking for a collaborator to generate a flexible and interactive dashboard-app-starter-kit for Shiny-beginners.

Next Part