
Problems running example.ALM: linked files, core dumped at refineHexMesh and decomposePar #26

Open
hcOnel opened this issue Sep 20, 2018 · 10 comments


@hcOnel

hcOnel commented Sep 20, 2018

I have solved the example.ABL.flatTerrain.stable case as a precursor and am trying to run example.ALM on top of it. I am at the preprocessing stage at the moment.
The first thing I couldn't find is the drivingData files: boundaryData and sources. These are supposed to be linked, but I do not have them in the precursor directory.

# Get the inflow data linked up.
echo "Linking the inflow data with this directory..."
ln -s $precursorDir/drivingData/boundaryData ./constant

# Get the driving source terms linked up.
echo "Linking the driving source terms with this directory..."
ln -s $precursorDir/drivingData/sources ./constant

Also I get the following errors while running runscript.preprocess:

Using refineHexMesh to perform  2  of local refinement...
   -Performing level 2 local refinement with topoSet/refineHexMesh
      *selecting cells to refine...
      *refining cells...
./runscript.preprocess: line 38: 12643 Aborted                 (core dumped) refineHexMesh local -overwrite > log.refineHexMesh.local.$i 2>&1
   -Performing level 1 local refinement with topoSet/refineHexMesh
      *selecting cells to refine...
      *refining cells...
./runscript.preprocess: line 38: 12646 Aborted                 (core dumped) refineHexMesh local -overwrite > log.refineHexMesh.local.$i 2>&1
Using decomposePar to decompose the problem for parallel processing...
./runscript.preprocess: line 172: 12648 Aborted                 (core dumped) decomposePar -cellDist -force -time $startTime > log.decomposePar 2>&1
Running checkMesh to report mesh diagnostics...
Using refineMesh to perform  0  of global refinement...
Using changeDictionary to ensure that the boundaries have the correct number of faces...
Using renumberMesh to renumber the mesh for better matrix conditioning...
Running checkMesh to report mesh diagnostics...

refineHexMesh and decomposePar logs show the error:

--> FOAM FATAL ERROR: 
Attempt to cast type patch to type lduInterface

    From function refCast<To>(From&)
    in file /home/canon/OpenFOAM/OpenFOAM-2.4.x/src/OpenFOAM/lnInclude/typeInfo.H at line 114.

FOAM aborting

The patch type is as it comes in the example case. Should it be changed?

@Bartdoekemeijer
Contributor

Here are the basic instructions for running a precursor and coupling it to SOWFA windPlantSolver simulations.

Generating the precursor data

  1. Run your precursor for a long period of time so that turbulent structures can arise and a quasi-equilibrium can form. This is typically 20,000 seconds. You do not need to output anything during this simulation. An example controlDict is given in the attachment controlDict.1.
  2. Update your controlDict to include the function “sampling/boundaryDataPre” (for example, controlDict.2). You can also add other sampling functions for visualization, but this is optional. In the boundaryDataPre file, you can select which domain side to record. Since the inflow is typically from the west, the default option is to record the west domain patch.
  3. Continue/restart your simulation from 20,000 seconds to 22,000 seconds using this new controlDict. This will generate 2,000 seconds of precursor data. You should see a folder called boundaryDataPre in your postProcessing folder.
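As a sketch, the include in step 2 would sit in the functions block of the controlDict (this is an assumption about the layout; the attached controlDict.2 shows the actual form used in SOWFA):

```
functions
{
    #include "sampling/boundaryDataPre"
}
```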

Preparing the precursor data
4. In the postProcessing folder, you will see a folder called SourceHistory and a folder called boundaryDataPre.
5. Copy the contents of SOWFA/tools/boundaryDataConversion to the postProcessing folder, and run the script makeBoundaryDataFiles.west.sh (note: you may have to change the first line of makeBoundaryDataFiles/data.py and of makeBoundaryDataFiles/points.py to fit your cluster). This will generate a folder called boundaryData containing the inflow data for the 2,000 seconds of simulation, in the correct format. This usually takes about 30 minutes on our cluster.
6. Copy the contents of SOWFA/tools/sourceDataConversion to the postProcessing folder, and run the script sourceHistoryRead.py. This will create a file called sources in your postProcessing folder.
7. Create a new folder in your main case directory (i.e., next to the folders constant and system) called drivingData. Copy the file sources and the folder boundaryData to drivingData.
8. In your main case directory, gather the state information from the various processors at time = 20,000 using OpenFOAM. This will be your initial condition for the wind farm simulation. Use this command:

reconstructPar -time 20000 -fields '(k kappat nuSgs p_rgh qwall Rwall T U)'

Coupling to a wind farm simulation
9. In your wind farm simulation folder, open the file runscript.preprocess and update precursorDir to the directory of your precursor case. That folder should contain drivingData (containing the sources file and the boundaryData folder) and the folder 20000 (containing your initial conditions). Also set startTime=20000, and make sure this matches your controlDict.
10. Set up the simulation settings in setUp. Run runscript.preprocess and then submit (qsub) an HPC job for runscript.solve.1.
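Steps 4–8 above can be sketched as a sequence of commands; the SOWFA checkout location and case path below are placeholders for your own setup, not prescribed names:

```shell
# Placeholder paths -- adjust for your cluster / checkout.
SOWFA_DIR=$HOME/SOWFA
PRECURSOR_CASE=$HOME/cases/example.ABL.flatTerrain.stable

cd $PRECURSOR_CASE/postProcessing

# Step 5: convert the sampled boundary planes to boundaryData format.
cp -r $SOWFA_DIR/tools/boundaryDataConversion/* .
./makeBoundaryDataFiles.west.sh      # writes ./boundaryData

# Step 6: convert the source history.
cp -r $SOWFA_DIR/tools/sourceDataConversion/* .
python sourceHistoryRead.py          # writes ./sources

# Step 7: collect everything under drivingData in the case root.
mkdir -p ../drivingData
cp -r boundaryData sources ../drivingData/

# Step 8: reconstruct the initial condition at t = 20,000 s,
# restricted to the fields handled by changeDictionaryDict.
cd ..
reconstructPar -time 20000 -fields '(k kappat nuSgs p_rgh qwall Rwall T U)'
```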

@hcOnel
Author

hcOnel commented Sep 21, 2018

Mr. Doekemeijer, thank you for your detailed explanations. These are extremely helpful for new users like me.
I have followed these steps. Although I'm having a hard time understanding how the flow direction and boundaries are defined and transformed at the different stages of the simulation, I won't ask for your time to explain that right now; I will keep trying to figure it out myself (watching the 2-hour video on the NREL website).
Right now I am trying to run a 500 s + 500 s precursor case (instead of 20,000 s + 2,000 s), not for the simulation's sake but just to see whether I get errors.
As prescribed in the example case, the flow direction is +x (eastwards), so I will be sampling the west patch.

  1. After step 6, the log is:
Number of vertices =  2821
Number of faces (original) =  2610


x_min =  -0.1
x_max =  0.1
y_min =  0.0
y_max =  10000.0
z_min =  0.0
z_max =  1000.1

but in the boundaryData/west/$timeStep directory, the k, T, and U files all show the average as 0, i.e.:

// Average
(0 0 0)

2610
(
(4.08173838431 0.127017697574 0.00932212747655)
(4.06631740709 0.121126538405 0.0112868540655)
(4.13315167556 0.111598777211 0.00616209385615)
...
  2. A stranger thing is that, during the pimpleFoam run of the wind farm simulation, I get the following error:
Starting time loop

<U_1> = (6.02094373982 0.0466931066434 -0.00018940097415)	<U_s> = (3.43129072862 0.103011340048 0)	<dU/dn> = (0.0761662650351 -0.00165641862954 -4.44682329569e-05)
[0] 
[0] 
[0] --> FOAM FATAL ERROR: 
[0] Cannot find starting sampling values for current time 500
Have sampling values for times 
162
(
503.105590062
506.211180124
509.316770186
512.422360248
...

and I couldn't manage to output that first 500 time directory. After the second run of the precursor case, the boundaryDataPre directory does not contain a 500 folder in the first place. My runscript.solve.2 and controlDict.2 files are set to startTime=500. Should I set it to slightly before 500?

Again, thanks for your most valuable input.
Huseyin

@ewquon
Collaborator

ewquon commented Sep 21, 2018 via email

@MerleSieb

Hi everyone,
thanks a lot for the great help provided here.
I was wondering: if I run a precursor case of, say, 3 km x 3 km x 1 km and afterwards would like to run a windPlantSolver.ALM case of only 1 km x 1 km x 1 km (for example, for fewer wind turbines), is this possible with the same precursor input? What do I have to adapt to be able to use the input? Or would I at least have to run the 2,000 seconds of input for the exact mesh I want to use, with the help of mapFields, beforehand?
The same question arises regarding mesh refinement. What if the "inlet side" is refined? Will the driving data be mapped accordingly?

Thanks a lot in advance and kind regards

Merle

@Bartdoekemeijer
Contributor

Bartdoekemeijer commented Jul 2, 2019

@MerleSieb Perhaps it is possible, but I have never seen it. The way it is set up now, is that SOWFA copies the base mesh from the precursor directory, and then performs refinements. I'm guessing "cropping" the mesh would be more of an OpenFOAM question rather than a SOWFA question.

Also, I believe the inlet is linearly mapped onto the mesh (both spatially and in time).

@MerleSieb

Thanks a lot for the reply and all help provided.

Is there any possibility of re-using precursor data for another wind velocity on the exact same mesh?
I guess simple scaling wouldn't really be helpful/work? But is there any way to generate a reasonable wind field with similar properties (it doesn't have to be exactly the same, of course) without re-running the precursor from the beginning? In my case I am talking about transferring the results from a 12 m/s run to 10 m/s (at 90 m height, for neutral conditions and low surface roughness).
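For what it's worth, the purely mechanical part of such a rescale is trivial; the sketch below (hypothetical, assuming the simplified boundaryData value-file layout quoted earlier in this thread: one "(ux uy uz)" vector per line) multiplies every sample by the ratio of target to source speed. It says nothing about whether the result is physically meaningful -- shear, turbulence intensity, and length scales all change with wind speed, which is the real concern here:

```python
def scale_vectors(lines, factor):
    """Multiply every "(ux uy uz)" sample line by a scalar factor.

    `lines` is the value list from a boundaryData field file, one
    parenthesized vector per line; non-vector lines (counts, braces)
    pass through unchanged.
    """
    out = []
    for line in lines:
        s = line.strip()
        if s.startswith("(") and s.endswith(")") and len(s) > 2:
            comps = [float(c) for c in s[1:-1].split()]
            out.append("(" + " ".join(str(c * factor) for c in comps) + ")")
        else:
            out.append(line)
    return out

# Rescale a 12 m/s sample toward 10 m/s (hub-height ratio only).
scaled = scale_vectors(["(12.0 0.6 0.0)"], 10.0 / 12.0)
```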

@MerleSieb

I have yet another question:
Is there a tool/script to export one of the binary full-field files used in FAST? I thought about comparing "standalone" FAST results to SOWFA ALM simulations. Instead of running several long simulations with different time seeds, it might be helpful to have quite similar inflow conditions from one single simulation.
I guess it should not be too difficult to write a script to transform the data into the right format, or am I missing something? As I am neither experienced nor good at scripting/coding, I thought maybe somebody has already done the same or has some tips on how to proceed.
Thank you for any suggestion.

@MerleSieb

Regarding my last question I found the scripts in the following links:
https://github.com/NWTC/datatools/blob/master/utilities/ensight_planes_to_hawc.py
https://github.com/NWTC/datatools/blob/master/utilities/structuredVTK_planes_to_hawc.py

Nevertheless, I am not sure how to use them, or what data to use them on. I saw there are foamToEnsight and foamToVTK utilities available, but what data do I run them on? The drivingData? Or do I have to use another sampling function during the simulation, specifying the format, etc.? From which directory do I execute the above scripts?

Thanks a lot in advance

@ewquon
Collaborator

ewquon commented Sep 30, 2019

@MerleSieb sorry for the delayed response. These depend on planes of data, i.e., surface sampling output from OpenFOAM either in the Ensight format or the structured VTK format. The writer for the latter is provided by libSOWFAfileFormats.so. Hopefully you got it figured out.
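For reference, a surface-sampling function object of the kind described above looks roughly like the following in an OpenFOAM 2.x controlDict; the entry names, plane location, and field list here are placeholders, so check the sampling documentation for your OpenFOAM version:

```
functions
{
    inflowPlanes
    {
        type                surfaces;
        functionObjectLibs  ("libsampling.so");
        outputControl       timeStep;
        outputInterval      1;
        surfaceFormat       ensight;   // or a structured-VTK writer
        fields              (U T);
        interpolationScheme cellPoint;
        surfaces
        (
            westPlane
            {
                type            cuttingPlane;
                planeType       pointAndNormal;
                pointAndNormalDict
                {
                    basePoint       (0 1500 500);
                    normalVector    (1 0 0);
                }
                interpolate     true;
            }
        );
    }
}
```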

@ghost

ghost commented Feb 19, 2020

refineHexMesh and decomposePar logs show the error:

--> FOAM FATAL ERROR: 
Attempt to cast type patch to type lduInterface

    From function refCast<To>(From&)
    in file /home/canon/OpenFOAM/OpenFOAM-2.4.x/src/OpenFOAM/lnInclude/typeInfo.H at line 114.

FOAM aborting

The type patch is as it is in the example case. Should it be changed?

I have also encountered this problem and solved it. It seems that you reconstructed all of the variables, while changeDictionaryDict only lists k, kappat, nuSgs, p_rgh, qwall, Rwall, T, and U. So after running changeDictionary, the other variables still have cyclic conditions, which confuses refineHexMesh and decomposePar.

Just keep those fields when running reconstructPar, or extend changeDictionaryDict to cover the other variables.
