[ANSWERED] What get_iplayer activities can run concurrently?
Are there dangers to running multiple instances of get_iplayer in different processes simultaneously? For example, running a refresh in one process and a download in another? Or running a TV refresh and a radio refresh in separate processes at the same time? etc.
Concurrent downloads would be OK. Concurrent TV/radio refreshes would not be OK (nor would that make sense). TV and radio refreshes in separate processes would be OK, but so little time would be saved it wouldn't be worth it. Concurrent downloads and refreshes would be OK, but concurrent searches and refreshes would not. After initial installation, refreshes are relatively quick, so there is no reason not to have an updated cache before performing searches or launching any downloads. I suggest you control refreshes in your application and use --expiry with a very large value for every download to avoid an automatic refresh ever kicking off.
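The --expiry suggestion above could be sketched as a small wrapper like this. This is only an illustration: the wrapper name, the ten-year constant, and the programme index passed to it are my own inventions, not part of get_iplayer; only --get, --refresh, --type, and --expiry come from the tool itself.

```shell
# Refreshes stay under the application's control; every download passes
# a huge --expiry so get_iplayer never starts an automatic refresh of
# its own. TEN_YEARS and download_no_refresh are made-up names.
TEN_YEARS=315360000

download_no_refresh() {
  # $1 = programme index (e.g. from a prior search)
  get_iplayer --get "$1" --expiry="$TEN_YEARS"
}

# Example usage (hypothetical index):
#   get_iplayer --type=tv --refresh   # explicit refresh, when we choose
#   download_no_refresh 12345
```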
Great, thank you. I've made a helpful (for me at least, maybe to others too!) summary table below. I had to make a few assumptions, but it sounds to me like the basic rule is: "don't read from or write to a cache file if there's a chance it might be written to at the same time".

Thanks also for the --expiry tip. I'll do that.

|            | Dnload | Refr TV | Refr Radio | Srch TV | Srch Radio |
|------------|--------|---------|------------|---------|------------|
| Dnload     |   N    |   Y     |     Y      |   Y     |     Y      |
| Refr TV    |   Y    |   N     |     Y      |   N     |     Y      |
| Refr Radio |   Y    |   Y     |     N      |   Y     |     N      |
| Srch TV    |   Y    |   N     |     Y      |   Y     |     Y      |
| Srch Radio |   Y    |   Y     |     N      |   Y     |     Y      |
Concurrent downloads have a potential problem I forgot about: there is a small risk that multiple processes could attempt to write to the download history file at the same time. The download itself will have completed by that point, but get_iplayer will die if it can't open the history file, so you may need to catch that (exit code 11) and re-run get_iplayer with --mark-downloaded to update the history.
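Catching that failure could look roughly like the sketch below. The function name is mine, and the exact way --mark-downloaded is invoked may differ from this; exit code 11 and --mark-downloaded themselves are as described above.

```shell
# Sketch: if get_iplayer dies with exit code 11 (couldn't open the
# download history file), the download itself has still completed, so
# only the history entry needs to be recorded afterwards.
download_with_history_retry() {
  get_iplayer --get "$1" --expiry=315360000
  if [ $? -eq 11 ]; then
    # Only the history write failed; record the programme as downloaded.
    # (Check the get_iplayer docs for the exact --mark-downloaded usage.)
    get_iplayer --mark-downloaded "$1"
  fi
}
```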

In the end, you may not want to bother with concurrent downloads. Depending on your connection speed and the CDN load, get_iplayer can easily saturate your available bandwidth with a single download.
Thanks. In that case I'll queue downloads to avoid them running simultaneously. I'd like to support simultaneous download and refresh because I don't think it makes sense to users if a refresh is delayed until whatever's currently downloading finishes. From what you say, this should be okay, so the current situation will work for what I need.

I've edited the table to reflect "no simultaneous downloads".
I've run the Web PVR process with up to 6 downloads at the same time with no big issues. The BBC seems able to manage, and I have a 200 Mb/s download connection I can use.

Thanks Alan for sharing your experience. My current code has a sequential download queue, but I might change it to allow simultaneous downloads, as you've described. In practice, I'm not sure it'll make a huge amount of difference, but from what you and @dinky have said, the chance of a problem is very small (and even then, recoverable).
I'm going to retract my earlier retraction. Concurrent downloads should be OK. Although get_iplayer will die if it can't open the history file, that wouldn't be caused by access from multiple processes. The writes to the download history file are essentially atomic as long as the entry is smaller than the I/O buffer size and the file is opened in append mode, which is always the case. That wasn't always true, but it was fixed quite a while back. Many users, including me, have been doing concurrent downloads for a long time without ill effects.

EDIT: But as I said before, concurrent downloads are worth the bother only if you can fill just a fraction of your available bandwidth with a single download.
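The append-mode point above can be illustrated with plain shell (nothing get_iplayer-specific; the file name is made up). Two processes append short lines to the same file; because each write is small and `>>` opens the file with O_APPEND, the lines land whole rather than interleaved:

```shell
# Two concurrent writers appending short entries to one "history" file.
rm -f /tmp/history_demo.txt
( for i in $(seq 1 100); do echo "proc-A entry $i"; done >> /tmp/history_demo.txt ) &
( for i in $(seq 1 100); do echo "proc-B entry $i"; done >> /tmp/history_demo.txt ) &
wait
wc -l < /tmp/history_demo.txt   # 200 complete lines, none torn
```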
@dinky, @alansh, thank you both. That's certainly useful general info to know. As I mentioned above, I'm currently doing only sequential downloads, but I might try concurrent in the future to see how it goes. Since I'm working on mobile, the processor/disk activity may be problematic with multiple simultaneous downloads (and especially, simultaneous ffmpeg conversions). But that's just a question of performance.