Royston Posted March 25, 2012

There are several questions embedded in this post, plus I was unsure whether to put this in Computer Science, so maybe it's more appropriate there. For my dissertation, I'm looking at the effects of astrophysical jets within radio galaxies; there is a clear link between galaxy evolution and the role of jets (specifically star formation rates). I think that to get a clearer understanding of such an environment, you need to model that environment.

I've been looking at modelling environments. The first one I came across was the RAMSES code, which is written in Fortran. So my first question is: I thought Fortran was more or less redundant, so is it worth trying to understand Fortran? I thought C/C++ would be more far-reaching. (I used to program in machine code, BTW, so learning isn't a huge issue.)

I looked around, and found MPI is doable through SLI setups, i.e. you can use multiple GPUs and a CPU to boot. So I'm now running Code::Blocks as an IDE, and have CUDA, which is a development toolkit, to start programming. So, second question: has anyone had a good experience with CUDA, and is it a viable option for simulating astrophysical situations? It opens up multiple-GPU use, which is clearly a good thing?

I am a newb to this area, any feedback is welcome.
Klaynos Posted March 26, 2012

Sorry about the slow reply. I'm not an astrophysicist, but I am a physicist, so hopefully I can add something here. I've only limited use of Fortran, but I know people who use it. I do know quite a bit about mathematical modelling, and I have a pretty solid programming background from outside physics.

There are several questions embedded in this post, plus I was unsure whether to put this in Computer Science, so maybe it's more appropriate there. For my dissertation, I'm looking at the effects of astrophysical jets within radio galaxies; there is a clear link between galaxy evolution and the role of jets (specifically star formation rates). I think that to get a clearer understanding of such an environment, you need to model that environment.

A sensible approach.

I've been looking at modelling environments. The first one I came across was the RAMSES code, which is written in Fortran. So my first question is: I thought Fortran was more or less redundant, so is it worth trying to understand Fortran?

There are a few things to remember when talking about Fortran and whether it is redundant. Good modelling codes are often written by postdocs. When computer modelling was taking off in a serious way, Fortran was THE method for solving mathematical problems. Those postdocs are now lecturers and professors, and they do not like letting go of a good thing.

I thought C/C++ would be more far-reaching. (I used to program in machine code, BTW, so learning isn't a huge issue.)

C (and C++) is a **** when trying to program any complicated mathematics. As an undergrad I had to do a programming course, which I found pretty easy. The final thing we had to do was develop our own program to do something physics-related. Most of my contemporaries wrote pretty simple classical mechanics things. I chose to write a graph plotting and fitting code for reflection measurements, to find the magneto-optical constant...
This required complex numbers; have you ever tried to do complex numbers with a traditional programming language? They are a ****. At this point I was already familiar with R (http://www.r-project.org/), which in concept is similar to Fortran: its design purpose is statistical and mathematical computing, and it works with imaginary numbers, matrices, etc. out of the box. If I'd tried to write all of my codes in C, it would have taken many orders of magnitude more coding time than it did.

I looked around, and found MPI is doable through SLI setups, i.e. you can use multiple GPUs and a CPU to boot. So I'm now running Code::Blocks as an IDE, and have CUDA, which is a development toolkit, to start programming.

I've only used MPI on multiple cores on one Linux machine; it saved a lot of time (this was running the Meep modelling code, which is actually written in C++). Seemed to work OK, but you've got to be a bit careful: older codes are often not designed to run that way, and you'll actually find that a single core with a faster clock speed will be faster. That's probably not such an issue over SLI, though, as I suspect it's more compatible with parallel computing.

So, second question: has anyone had a good experience with CUDA, and is it a viable option for simulating astrophysical situations? It opens up multiple-GPU use, which is clearly a good thing? I am a newb to this area, any feedback is welcome.

No experience, I'm afraid.
Royston Posted March 28, 2012 Author (edited)

Thanks for the detailed response, Klaynos.

There are a few things to remember when talking about Fortran and whether it is redundant. Good modelling codes are often written by postdocs. When computer modelling was taking off in a serious way, Fortran was THE method for solving mathematical problems. Those postdocs are now lecturers and professors, and they do not like letting go of a good thing.

That makes sense. There's no way, with my current workload, I'll be learning an entire language. However, when I do choose a programming course, once my degree is finished, I just wanted to make sure I was making the right choice before taking the plunge, so...

C (and C++) is a **** when trying to program any complicated mathematics.

...is the type of information I need.

At this point I was already familiar with R (http://www.r-project.org/), which in concept is similar to Fortran: its design purpose is statistical and mathematical computing, and it works with imaginary numbers, matrices, etc. out of the box. If I'd tried to write all of my codes in C, it would have taken many orders of magnitude more coding time than it did.

It's funny you mention R; I was going to use it in an astrophysics project last year for statistics (can't remember why I changed my mind). You've raised a good point, though: if somebody has already done the legwork, then build from there. IOW, look into modifying environments that have already been developed. Albeit I'd still need to fully understand the language, but I could gloss over any complications, e.g. complex numbers, like you said.

I've only used MPI on multiple cores on one Linux machine; it saved a lot of time (this was running the Meep modelling code, which is actually written in C++).
Seemed to work OK, but you've got to be a bit careful: older codes are often not designed to run that way, and you'll actually find that a single core with a faster clock speed will be faster. That's probably not such an issue over SLI, though, as I suspect it's more compatible with parallel computing.

With your final point, that's precisely why I was looking at CUDA. For me, it's a case of what computing power is accessible for a home set-up, so this seemed like an obvious choice, especially as GPUs are massively multi-core. So I'll have to look at platforms that have been developed for CUDA that take away the pain of having to code complicated maths in C/C++.

No experience, I'm afraid.

No problem, you've narrowed down what I should be looking towards, which is the main thing.

EDIT: Yikes, I mistakenly pressed the minus button on the post rating. I'm not really into that rep stuff, but if a mod can reverse that please, thanks.

Edited March 28, 2012 by Royston