Yes! You can do that in mgcv
This is the page for a talk I gave at a “Workshop on Ecological and Environmental Statistics” at Lancaster University, September 11-13 2023.
If you just want the talk slides click here. If you want the talk slides source, click here.
Below is a per-slide set of references, in case that’s useful to anyone…

It’s me! You can find me on the web on the BioSS website and on my personal website.  
Unfortunately due to the slow death of twitter dot com, I can’t find Jenny’s original tweet any more :(  
mgcv’s page on CRAN. Simon’s homepage.

For a real introduction to GAMs, I’d recommend Simon’s book (which is also a good (technical) introduction to GLMs, GLMMs and more). Also useful are two online courses, one from Noam Ross here and another from Gavin Simpson.  
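As a minimal illustration (my sketch, not code from the talk), fitting a GAM in mgcv takes one line, here on simulated test data from gamSim:

```r
library(mgcv)

set.seed(1)
# simulate the classic Gu & Wahba four-term test data
dat <- gamSim(eg = 1, n = 200, verbose = FALSE)

# fit smooths of each covariate; REML smoothness selection is a good default
b <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat, method = "REML")
summary(b)
```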
These things are wiggly and we’re going to talk about wigglyness (or wiggliness). This is a Real Maths Word no matter what anyone says (I mean, we all know about the monster group and this is no more silly a name than that).  
These penalties can take various different forms based on the problem we’re interested in. Sometimes they are discrete (per Eilers and Marx). Eric Pedersen has a nice Shiny app to illustrate penalty action on a spline.  
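To see the penalty in action without a Shiny app, you can fix the smoothing parameter by hand (a sketch of mine, not from the talk); a heavier penalty shrinks the fit towards a straight line, which shows up as fewer effective degrees of freedom:

```r
library(mgcv)

set.seed(2)
dat <- gamSim(eg = 1, n = 200, verbose = FALSE)

# sp fixes the smoothing parameter rather than estimating it
b_wiggly <- gam(y ~ s(x2), data = dat, sp = 1e-6)  # almost no penalty
b_smooth <- gam(y ~ s(x2), data = dat, sp = 1e6)   # heavy penalty, nearly linear

# the heavily penalized fit uses far fewer effective degrees of freedom
c(wiggly = sum(b_wiggly$edf), smooth = sum(b_smooth$edf))
```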
These are all different smoothers we can use in mgcv. The source for the presentation shows how they were generated, mostly using code from the mgcv manual. Top row, left to right: 1D smoother, 2D isotropic smoother (thin plate regression spline, see ?smooth.construct.tp.smooth.spec), perspective plot view of a tensor product (?te) 2D smoother. Bottom row, left to right: the soap film smoother (?soap and Wood et al (2008)), smoothing on the sphere (?smooth.construct.sos.smooth.spec), a cyclic smoother (?smooth.construct.cs.smooth.spec).
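Switching between these smoothers is just a change of basis (the bs argument) or constructor; a quick sketch of mine, assuming only simulated data:

```r
library(mgcv)

set.seed(3)
dat <- gamSim(eg = 1, n = 300, verbose = FALSE)

# 2D isotropic thin plate regression spline (the default basis, "tp")
b_tp <- gam(y ~ s(x1, x2, bs = "tp"), data = dat)

# tensor product smooth, better when covariates are on different scales
b_te <- gam(y ~ te(x1, x2), data = dat)

# cyclic cubic spline, for things like time of year where the ends must join
b_cc <- gam(y ~ s(x2, bs = "cc"), data = dat)
```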

Figure here is from Jacobson et al (2022), where we used Markov random fields to model beaked whale response to navy sonar. Gavin Simpson has a nice post on Markov random fields in mgcv. You can find out more about fitting random effects and Gaussian processes by looking at their respective manual pages: ?random.effects and ?smooth.construct.gp.smooth.spec.
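Random effects and Gaussian processes are both “just smoothers” here; a minimal sketch (mine, with a made-up grouping factor fac), showing the "re" and "gp" bases side by side:

```r
library(mgcv)

set.seed(4)
dat <- gamSim(eg = 1, n = 200, verbose = FALSE)
# a hypothetical grouping factor for the random intercept
dat$fac <- factor(sample(1:10, 200, replace = TRUE))

# bs = "gp" gives a Gaussian process smooth of x2;
# bs = "re" gives an i.i.d. Gaussian random intercept per level of fac
b <- gam(y ~ s(x2, bs = "gp") + s(fac, bs = "re"), data = dat, method = "REML")

gam.vcomp(b)  # variance components, including the random effect's
```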

The original paper on stochastic partial differential equation smoothing with INLA is Lindgren et al (2011). We wrote a paper on how to fit these models in mgcv, Miller et al (2020), with associated code on GitHub.

Hastie and Tibshirani (1993) is the first reference to varying coefficient models. Thorson et al (2023) provide a nice review and applications in fisheries.  
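In mgcv a varying coefficient model is just a smooth with a numeric by variable: the smooth of x multiplies z, so the fitted smooth is the x-varying coefficient of z. A small simulated sketch of mine:

```r
library(mgcv)

set.seed(5)
n <- 300
x <- runif(n)
z <- runif(n)
# the coefficient of z varies smoothly with x: y = f(x) * z + noise
y <- (2 * sin(pi * x)) * z + rnorm(n, sd = 0.3)

# a numeric "by" variable turns s(x) into a varying coefficient for z
b <- gam(y ~ s(x, by = z), method = "REML")
```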
Figure from Pedersen et al (2019), where we describe all these models and how to fit them in mgcv . 

Again figure from Pedersen et al (2019). Some applications in Miller et al (2021) and Mannocci et al (2020).  
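Two of the hierarchical GAM flavours from Pedersen et al (2019) can be sketched like this (my example on gamSim’s factor-by test data, not code from the paper): an "fs" smooth gives one wiggly curve per group sharing a smoothness, while factor-by smooths give each group its own smoothing parameter.

```r
library(mgcv)

set.seed(6)
dat <- gamSim(eg = 4, n = 400, verbose = FALSE)  # factor-by example data

# one smooth per level of fac, sharing a smoothing parameter ("fs" basis)
b_fs <- gam(y ~ s(x2, fac, bs = "fs"), data = dat, method = "REML")

# separate smooths per level, each with its own smoothness (factor "by")
b_by <- gam(y ~ fac + s(x2, by = fac), data = dat, method = "REML")
```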
We’re working on a paper to introduce these ideas at the moment, which we hope you’ll find useful!  
The aforementioned paper will also cover these models. These are inspired by the work of Gasparrini (2011) (and references therein).  
Fitting Poisson processes via GLM-type methods is old news, dating back at least to Berman and Turner (1992). This is also basically the same idea that Simpson et al (2016) use to fit their models in R-INLA/inlabru. I started writing a package to fit these models a while ago; it needs a bit of love but basically works, and you can find it here.
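The Berman-Turner device can be sketched in a few lines: augment the observed events with dummy (quadrature) points, attach quadrature weights, and fit a weighted Poisson regression whose fitted values approximate the log-intensity. This is my rough 1D sketch with crude equal weights, not the package’s implementation:

```r
library(mgcv)

set.seed(7)
# inhomogeneous Poisson process on [0, 1] with intensity exp(2 + sin(2*pi*x)),
# simulated by thinning a dominating homogeneous process of rate exp(3)
lambda <- function(x) exp(2 + sin(2 * pi * x))
cand <- runif(rpois(1, exp(3)))
events <- cand[runif(length(cand)) < lambda(cand) / exp(3)]

# events plus a regular grid of dummy points, with quadrature weights w
quad <- seq(0, 1, length.out = 200)
x <- c(events, quad)
z <- c(rep(1, length(events)), rep(0, length(quad)))  # 1 = event, 0 = dummy
w <- rep(1 / length(quad), length(x))                 # crude equal weights

# Berman-Turner: weighted Poisson regression with response z/w;
# expect a warning about non-integer responses, which is fine here
b <- gam(z / w ~ s(x), weights = w, family = poisson())
```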

This is a new feature that I requested for mgcv; you can try it out by looking at the examples in ?gfam.

You can find out more about ordered categorical responses by looking at the ?ocat manual page.
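A minimal sketch of mine (not from the talk): ocat takes integer category labels 1 to R as the response, and predict with type = "response" returns per-category probabilities.

```r
library(mgcv)

set.seed(8)
dat <- gamSim(eg = 1, n = 400, verbose = FALSE)
# fake an ordered response by cutting the continuous y into 3 categories
dat$yord <- as.numeric(cut(dat$y, breaks = 3))

b <- gam(yord ~ s(x2), data = dat, family = ocat(R = 3))

# predicted probability of each of the 3 categories for each observation
probs <- predict(b, type = "response")
```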

bam is the fitting function to use for these models. There is detail on the methods in Wood et al (2016). 
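bam is a near drop-in replacement for gam on large datasets; a quick sketch of mine on simulated data:

```r
library(mgcv)

set.seed(9)
dat <- gamSim(eg = 1, n = 10000, verbose = FALSE)

# bam uses much less memory than gam on big data; discrete = TRUE adds
# covariate discretisation for further speed-ups (with fREML, bam's default)
b <- bam(y ~ s(x0) + s(x1) + s(x2), data = dat, discrete = TRUE)
```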

So much is hidden in the mgcv manual; just typing help.start() in R and looking up the mgcv manual can get you somewhere interesting. I’d also suggest looking at the ?gam.models page, as there are some true gems there.

You can get into the Bayesian understanding of smoothing with a paper I wrote and the references therein.  
See the paper above for more information. For an interface with Stan there is brms; for JAGS there is the built-in jagam, which can be easily adapted to Nimble too. Finally, for TMB I wrote mgcvminusminus, which implements most common models but is not complete.
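jagam doesn’t fit anything itself: it writes a JAGS model file and returns the data and initial values needed to run the GAM through MCMC. A minimal sketch of mine, writing the model to a temporary file:

```r
library(mgcv)

set.seed(10)
dat <- gamSim(eg = 1, n = 200, verbose = FALSE)

# jagam writes JAGS model code for the GAM and returns everything
# needed (data, initial values) to fit it by MCMC
jd <- jagam(y ~ s(x0) + s(x1), data = dat,
            file = tempfile(fileext = ".jags"))

names(jd$jags.data)  # the data objects the JAGS model expects
```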

These organisations really helped fund my time to understand all the things that I presented in this talk. For space reasons, BioSS and UKCEH aren’t here but should be recognised!!  
As you can see from the above, there are lots of things in mgcv that you can do to fit a wide range of models. For those times your model can't be fit in mgcv, there are ways to extend via commonly-used fitting frameworks (MCMC or not) that can help you go from something you understand to something more fancy. 