Using UQLab with Python - how to circumvent 2GB data transfer limit?

Hello everyone,

I would like to use UQLab to conduct sensitivity studies on an aircraft design library I wrote in Python. As a starting point, I implemented a simple Cotter analysis with 5 input variables, which implies 2k + 2 = 12 model evaluations in total.

My Python (3.9) model is treated as a black-box function, which is executed from MATLAB as follows (note: pyenv(ExecutionMode="OutOfProcess")):

output_variable = pyrunfile(strcat(" ", input_variables), "output_variable");


" " is the master script managing my Python model
input_variables is a JSON string of model inputs
output_variable is a single float used for the Cotter analysis
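For context, a minimal sketch of what the master script's side of such an interface could look like (all names here are hypothetical, not the actual library): pyrunfile passes everything after the file name as command-line arguments, so the script parses the JSON string from sys.argv and exposes a top-level scalar that MATLAB can retrieve.

```python
# Sketch of a master script callable via pyrunfile (names are hypothetical).
# pyrunfile passes everything after the file name as command-line arguments,
# which Python receives in sys.argv.
import json
import sys

def run_model(inputs):
    # Placeholder for the actual aircraft design model: here we simply
    # combine the inputs into one scalar quantity of interest.
    return sum(float(v) for v in inputs.values())

# First command-line argument: the JSON string of model inputs
# (fall back to a dummy input when run without arguments).
raw = sys.argv[1] if len(sys.argv) > 1 and sys.argv[1].lstrip().startswith("{") else '{"x1": 1.0}'
input_variables = json.loads(raw)

# pyrunfile(..., "output_variable") reads this top-level variable back into MATLAB.
output_variable = run_model(input_variables)
```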

Halfway through the 11th model evaluation, MATLAB returns a Python package-specific error ("Python Error: TypeError: Numerical inputs to HAPropsSI must be ints, floats, lists, or 1D numpy arrays.", i.e. a fluid-property library being passed an invalid input). However, when I run only the 11th model evaluation on its own, no error occurs, and its results are identical to those of the full run up to the point where the latter failed. Moreover, when I run all 12 evaluations on a subset of my model (less computationally and memory expensive), there is no error either, and the Cotter indices are displayed graphically. After repeating these steps several times, the error always occurs in exactly the same place: halfway through the 11th model evaluation.

All this suggests to me that I am exceeding the 2 GB data-transfer limit between MATLAB and Python (see "Limitations to Python Support - MATLAB & Simulink - MathWorks United Kingdom"). I tried setting

myAnalysis = uq_createAnalysis(SOpts)
myAnalysis.SaveEvaluations = 0

as per Table 3 of 'UserManual_Sensitivity' on the UQLab website, in the hope that this would reduce the data volume, but without success.
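One generic workaround worth noting (a sketch of the idea, not UQLab functionality) is to bypass the MATLAB/Python bridge for the return value entirely: the Python script writes its scalar result to a file, and MATLAB reads that file afterwards (e.g. with readmatrix). The file name and model function below are made up for illustration.

```python
# Sketch: write the scalar result to disk instead of returning it through
# the MATLAB/Python bridge, so the return value never crosses the
# inter-process transfer limit. Names are hypothetical.
import json
import sys
import tempfile
from pathlib import Path

def run_model(inputs):
    # Placeholder for the real model evaluation.
    return sum(float(v) for v in inputs.values())

raw = sys.argv[1] if len(sys.argv) > 1 and sys.argv[1].lstrip().startswith("{") else '{"x1": 2.0}'
result = run_model(json.loads(raw))

# Write only the scalar quantity of interest; MATLAB then reads this file
# instead of receiving the value via pyrunfile.
out_file = Path(tempfile.gettempdir()) / "model_output.txt"
out_file.write_text(f"{result:.16e}\n")
```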

If anyone has encountered a similar issue and might know a solution, I would be immensely thankful.

Best wishes,

Hi @nmb29

Have you tried running Python in the same process as MATLAB? You can do this by setting pyenv(ExecutionMode='InProcess'). As stated in the link you shared, the 2 GB limit applies to data transfer between the two processes, so running everything in a single process may avoid it.

Let me know if this does not work. There are other (albeit less efficient and less clean) solutions that may help.

Best regards

Dear Styfen,

Thank you for your prompt response. I’ve tried your suggestion, but unfortunately my code still fails in the same location. On top of that, MATLAB tends to freeze when running Python ‘InProcess’, so I had to write the outputs to a text file to track progress.

I have opened a ticket with MATLAB customer support, because it seems to be a MATLAB- rather than UQLab-specific problem. However, I would still be keen to hear about the other solutions you have in mind.

Best wishes and many thanks in advance,


It seems you are interested in sensitivity analysis for complex simulations. However, sensitivity methods such as Sobol' indices are usually applied to a limited set of relevant quantities of interest post-processed from your simulation.
Only this information should be transferred between MATLAB and Python in your case, not the full simulation data for each run.
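As a sketch of that idea (all names and numbers below are made up): post-process the large field output down to the scalar quantity of interest inside Python, so that only a single float ever crosses the MATLAB/Python boundary.

```python
# Sketch: reduce a large simulation field to a scalar QoI inside Python,
# so only that scalar is transferred to MATLAB. Names are hypothetical.
import numpy as np

def simulate(inputs):
    # Stand-in for an expensive simulation returning a large field
    # (e.g. a stress field over a refined mesh).
    rng = np.random.default_rng(seed=int(inputs["seed"]))
    return rng.normal(loc=inputs["load"], scale=1.0, size=1_000_000)

def quantity_of_interest(field):
    # Keep only what the sensitivity analysis needs, e.g. the peak value.
    return float(np.max(field))

field = simulate({"seed": 0, "load": 100.0})
output_variable = quantity_of_interest(field)  # a single float, not the full field
```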

Also, the Cotter method is implemented in UQLab for historical reasons, but it is rarely used anymore because its indices are hard to interpret in complex cases.
With perhaps 20 runs of your model with 5 inputs, you could already obtain a decent sparse polynomial chaos expansion (PCE) and the related sensitivity indices.
In this context, if your simulation output is so large because you export full displacement or stress fields over a refined mesh, you should use model-reduction techniques such as principal component analysis (PCA) to compress these fields before performing the sensitivity analysis.
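A minimal illustration of that compression step, using plain NumPy SVD (the snapshot data and dimensions are fabricated): field outputs from many runs are projected onto a few principal components, and the sensitivity analysis is then carried out on those few coefficients instead of the full field.

```python
# Sketch: compress field outputs from N model runs with PCA (via SVD),
# then run the sensitivity analysis on the few retained coefficients.
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_nodes = 20, 5000            # 20 model runs, field on 5000 mesh nodes
# Fake snapshot matrix: each row is one run's field, built from 3 modes + noise.
modes = rng.normal(size=(3, n_nodes))
coeffs = rng.normal(size=(n_runs, 3))
snapshots = coeffs @ modes + 0.01 * rng.normal(size=(n_runs, n_nodes))

# PCA: center the snapshots and take the leading right-singular vectors.
mean_field = snapshots.mean(axis=0)
centered = snapshots - mean_field
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

n_comp = 3                             # keep the 3 dominant components
scores = centered @ Vt[:n_comp].T      # (n_runs, 3): inputs to the sensitivity study

# Check compression quality: reconstruct the fields and compare.
reconstructed = scores @ Vt[:n_comp] + mean_field
rel_error = np.linalg.norm(reconstructed - snapshots) / np.linalg.norm(snapshots)
```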

Best regards

Dear Professor Sudret,

Thank you for suggesting PCE-based Sobol’ indices. I have implemented the method and it works fine!

Best wishes,