Steven G. Kargl wrote:
On Wed, 16 Apr 2025 07:44:08 +0000, Lawrence D'Oliveiro wrote:
On Tue, 15 Apr 2025 18:28:31 -0500, Lynn McGuire wrote:
On 4/15/2025 6:14 PM, Lawrence D'Oliveiro wrote:
On Mon, 14 Apr 2025 23:50:32 -0500, Lynn McGuire wrote:
Got rid of a few nasty bugs like:
iword = 6Habcdef
Surely whether that's a bug or not would depend on the type of "iword"
...
iword is an implicitly typed 4-byte integer capable of storing 4 characters.
I thought you got rid of all the implicit typing.
Implicit typing has nothing to do with numeric storage size.
program foo
use iso_fortran_env, only : numeric_storage_size
integer :: j = 0
i = 6Habcdef ! i has an implicit type of default integer kind
j = 6Habcdef ! j has an explicit type of default integer kind
print *, i, j
print *, numeric_storage_size
end program foo
I believe you'll find that some of that syntax did not exist under
Fortran 77, in particular the "use" line.
Did Lynn convert to F90 first?
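The truncation Kargl's example turns on can be sketched outside Fortran as well. Below is a minimal Python analogue of the Fortran 77 assignment iword = 6Habcdef on a machine with a 4-byte default INTEGER; the helper names pack_hollerith and unpack_hollerith are illustrative, not anything from the thread:

```python
import struct

def pack_hollerith(text: str) -> int:
    """Pack characters into a 4-byte word, as a Hollerith constant is
    packed into a default INTEGER.  A 6-character constant offered to a
    4-byte integer keeps only its first 4 characters; the rest are
    silently dropped -- the bug class being discussed."""
    raw = text.encode("ascii")[:4].ljust(4, b" ")  # truncate or blank-pad to 4
    return struct.unpack("<I", raw)[0]

def unpack_hollerith(word: int) -> str:
    """Recover the stored characters from the 4-byte word."""
    return struct.pack("<I", word).decode("ascii")

word = pack_hollerith("abcdef")   # 6 characters offered, as in 6Habcdef
print(unpack_hollerith(word))     # only "abcd" survived the assignment
```

The point matches Kargl's: whether i or j got its type implicitly or explicitly changes nothing about how many character's worth of bits a default-kind integer can hold.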
On 11/23/2022 5:36 PM, Tran Quoc Viet wrote:
On Saturday, November 19, 2022 at 1:01:25 PM UTC+7, Lynn McGuire wrote:
We are converting an engineering software product of 700,000+ lines of
Fortran 77 plus 50,000+ lines of C++ to C++. With all that code,
In comp.lang.fortran Lynn McGuire <lynnmcguire5@gmail.com> wrote:
On 11/23/2022 5:36 PM, Tran Quoc Viet wrote:
On Saturday, November 19, 2022 at 1:01:25 PM UTC+7, Lynn McGuire wrote:
We are converting an engineering software product of 700,000+ lines of
Fortran 77 plus 50,000+ lines of C++ to C++. With all that code,
Mentioned recently on Hacker News:
https://www.lanl.gov/media/publications/1663/0125-llm-translation
"...O'Malley is taking open-source LLMs, running them on Lab computers, and plying the models with a technique called retrieval-augmented generation (RAG), where generative models are enhanced with data from external sources. The idea is to train the models to translate from Fortran to C++ using what's known as few-shot learning..."
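The setup the article describes can be sketched in a few lines. This is a toy illustration only: the token-overlap retriever, the EXAMPLES corpus, and the function names build_prompt and retrieve are assumptions for the sketch, not LANL's actual RAG pipeline.

```python
# Toy retrieval-augmented few-shot prompt builder for Fortran -> C++
# translation.  Everything here is an illustrative stand-in.

EXAMPLES = [  # tiny corpus of paired Fortran/C++ translations
    ("DO 10 I = 1, N\n      A(I) = 0.0\n10    CONTINUE",
     "for (int i = 0; i < n; ++i) a[i] = 0.0;"),
    ("IF (X .GT. 0.0) Y = SQRT(X)",
     "if (x > 0.0) y = std::sqrt(x);"),
]

def retrieve(query: str, k: int = 2):
    """Toy retriever: rank stored Fortran snippets by shared tokens
    with the query (a real system would use embeddings)."""
    q = set(query.upper().split())
    scored = sorted(EXAMPLES, key=lambda ex: -len(q & set(ex[0].split())))
    return scored[:k]

def build_prompt(fortran_src: str) -> str:
    """Assemble a few-shot prompt: retrieved translation pairs first,
    then the target snippet for the model to complete."""
    shots = "\n\n".join(
        f"Fortran:\n{src}\nC++:\n{cxx}" for src, cxx in retrieve(fortran_src)
    )
    return (f"Translate Fortran 77 to C++.\n\n{shots}\n\n"
            f"Fortran:\n{fortran_src}\nC++:\n")

print(build_prompt("IF (Z .GT. 1.0) W = SQRT(Z)"))
```

The "few-shot" part is the retrieved example pairs placed in the prompt; the "retrieval-augmented" part is that those pairs are chosen per input rather than fixed.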