Automatically add a function to a list of functions in R

r

57 views

3 answers

All the functions receive the same argument, and I want to run all of them on the same data, so I manually created a list of functions:

foo_1 <- function(data){
  # ...
}

foo_2 <- function(data){
  # ...
}

foo_3 <- function(data){
  # ...
}

then: funcs <- c(foo_1, foo_2, foo_3)

The downside of this approach is that if I create a new function, say foo_4, I need to manually add it to the funcs vector. Is there any way to do this kind of thing automatically?

Author: d_e Posted: 08.11.2019 11:06

Answers (3)


2 upvotes

Solution

Let us first put your functions into a file from which to source them back in ...

# creating the source file for the functions 

funcs_text <- 
"
foo_1 <- function(data){}
foo_2 <- function(data){}
foo_3 <- function(data){}
"

fname <- tempfile()
writeLines(funcs_text, fname)

Now we can read the file back in via source(). The trick is to first create a new environment (new.env()) into which the functions defined in the file are sourced. This environment is passed to source() via the local argument. Thereafter we transform the environment into a list, et voilà.

# reading in file
funcs <- new.env() 
source(fname, local=funcs)
funcs <- as.list(funcs)

Now you can further use your list of functions to do your processing.

# accessing a function

funcs[[1]]
funcs$foo_1
funcs[["foo_1"]]


# calling a function

funcs[[1]]()
funcs$foo_1()
funcs[["foo_1"]]()
Author: petermeissner Posted: 20.08.2016 09:55

2 upvotes

You could go beyond source() and actually make a package for these functions; that way you get the benefit of rich documentation for them, e.g.:

#' Multiply x by 2
#'
#' Spiffy long description here
#'
#' @param x the data
#' @return atomic numeric value
#' @export
f1 <- function(x) { x * 2 }

#' Divide x by 2
#'
#' Spiffy long description here
#'
#' @param x the data
#' @return atomic numeric value
#' @export
f2 <- function(x) { x / 2 }

#' Subtract 3 from x
#'
#' Spiffy long description here
#'
#' @param x the data
#' @return atomic numeric value
#' @export
f3 <- function(x) { x - 3 }

Put those in a package called myfuncs, build it, and then you can do something like:

library(myfuncs)
library(purrr)

lsf.str("package:myfuncs") %>% 
  invoke_map_dbl(list(x=100))

You can just use invoke_map() if you don't care about type safety or are returning more complex objects than what purrr supports out of the box.
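
A rough sketch of that variant, under the same assumption that the hypothetical myfuncs package from above is installed:

library(myfuncs)
library(purrr)

# invoke_map() returns a list, so any return type is fine
lsf.str("package:myfuncs") %>% 
  invoke_map(list(x = 100))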

You can add non-exported functions to that package as well if you ever need supporting functions for your exposed functions.

Author: hrbrmstr Posted: 20.08.2016 10:24

1 upvote

In your example, the functions are enumerated and follow a naming pattern, so you can use a combination of grep() and lsf.str(). Finally, get() retrieves the functions into a list.

foo_1 <- function(data){
    data*2
}
foo_2 <- function(data){
    data*3
}
bar <- function(data){
    data+22
}

## Vector output of functions that match string
funcs_names <- lsf.str()[grep("foo_",lsf.str())]  

## Get function calls
funcs <- sapply(funcs_names,get)
funcs
## $foo_1
## function (data) 
## {
##     data * 2
## }
## 
## $foo_2
## function (data) 
## {
##     data * 3
## }
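
With the functions collected like this, running them all on the same input is just one more loop over the list (a quick sketch; the input value 10 is arbitrary):

# run every matched function on the same input
sapply(funcs, function(f) f(10))
## foo_1 foo_2 
##    20    30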
Author: Therkel Posted: 20.08.2016 10:07