Update to fparser 0.2 #373

Open
wants to merge 58 commits into base: develop
282f068
Merge pull request #19 from hiker/linker-lib-flags
hiker Sep 24, 2024
3b6e0bd
Support new and old style of PSyclone command line (no more nemo api …
hiker Sep 26, 2024
16d3ff5
Fix mypy errors.
hiker Sep 26, 2024
71fd1ae
Added missing tests for calling psyclone, and converting old style to…
hiker Sep 30, 2024
ec4c0f6
Updated comment.
hiker Sep 30, 2024
b9aabf8
Removed mixing, use a simple regex instead.
hiker Oct 17, 2024
8ee10e8
Added support for ifx/icx compiler as intel-llvm class.
hiker Oct 18, 2024
d7b2008
Added support for nvidia compiler.
hiker Oct 18, 2024
9005b3b
Add preliminary support for Cray compiler.
hiker Oct 18, 2024
8771e80
Added Cray compiler wrapper ftn and cc.
hiker Oct 18, 2024
0188050
Follow a more consistent naming scheme for crays, even though the nat…
hiker Oct 22, 2024
3c569bd
Changed names again.
hiker Oct 22, 2024
edc5fcd
Renamed cray compiler wrapper to be CrayCcWrapper and CrayFtnWrapper,…
hiker Nov 11, 2024
f6a70c8
Fixed incorrect name in comments.
hiker Nov 11, 2024
4f0e70f
Merge pull request #28 from hiker/additional_compilers
lukehoffmann Nov 11, 2024
58caecf
Merge branch 'compiler_wrapper' into bom_master
hiker Nov 11, 2024
5452d70
Merge branch 'linker-lib-flags' into additional_compilers
hiker Nov 12, 2024
d54d94c
Merge branch 'linker-lib-flags' into bom_master
hiker Nov 12, 2024
605e7e5
Merge branch 'additional_compilers' into bom_master
hiker Nov 12, 2024
70ad4b1
Merge branch 'linker-lib-flags' into additional_compilers
hiker Nov 12, 2024
b70f98f
Merge branch 'linker-lib-flags' into additional_compilers
hiker Nov 12, 2024
2f7e3ba
Merge branch 'linker-lib-flags' into additional_compilers
hiker Nov 12, 2024
7a2eb59
Additional compilers (#349)
hiker Nov 12, 2024
bd1d318
Merge branch 'dev' into additional_compilers
hiker Nov 12, 2024
2148fb7
Merge branch 'compiler_wrapper' into update_psyclone_to_support_next_…
hiker Nov 19, 2024
20fe928
Merge branch 'update_psyclone_to_support_next_release_syntax' into ad…
hiker Nov 19, 2024
f7b49e0
Merge branch 'linker-lib-flags' into additional_compilers
hiker Nov 21, 2024
a493c53
Support new and old style of PSyclone command line (no more nemo api …
hiker Sep 26, 2024
824851d
Fix mypy errors.
hiker Sep 26, 2024
16a125c
Added missing tests for calling psyclone, and converting old style to…
hiker Sep 30, 2024
fc19283
Added shell tool.
hiker Oct 23, 2024
730a824
Try to make mypy happy.
hiker Oct 23, 2024
6e280d9
Removed debug code.
hiker Oct 23, 2024
6c3f1c2
ToolRepository now only returns default that are available. Updated t…
hiker Oct 23, 2024
ae61d4a
Fixed typos and coding style.
hiker Nov 21, 2024
e7c2c83
Support new and old style of PSyclone command line (no more nemo api …
hiker Sep 26, 2024
e2051f2
Fix mypy errors.
hiker Sep 26, 2024
0ad85ee
Added missing tests for calling psyclone, and converting old style to…
hiker Sep 30, 2024
890b50d
Updated comment.
hiker Sep 30, 2024
032ab26
Fixed failing tests.
hiker Nov 21, 2024
7168f42
Merge branch 'additional_compilers_clean' into psyclone_3_support_clean
hiker Nov 21, 2024
70c083e
Merge branch 'psyclone_3_support_clean' into add_shell_tool_clean
hiker Nov 21, 2024
8753d0c
Updated fparser dependency to version 0.2.
hiker Nov 28, 2024
634d28c
Replace old code for handling sentinels with triggering this behaviou…
hiker Nov 29, 2024
78697bf
Fixed tests for latest changes.
hiker Nov 29, 2024
c82cedf
Removed invalid openmp continuation line - since now fparser fails wh…
hiker Nov 29, 2024
652db98
Added test for disabled openmp parsing. Updated test to work with new…
hiker Nov 29, 2024
ea7e428
Coding style changes.
hiker Nov 29, 2024
137d346
Fix flake issues.
hiker Nov 29, 2024
fa0cb5d
Fixed double _.
hiker Nov 29, 2024
63e77e5
Merge branch 'psyclone_3_support_clean' into add_shell_tool_clean
hiker Nov 29, 2024
810da77
Merge branch 'add_shell_tool_clean' into update_to_fparser_0_2
hiker Nov 29, 2024
bbdb380
Merge branch 'develop' into add_shell_tool_clean
hiker Dec 2, 2024
1335878
Merge branch 'add_shell_tool_clean' into update_to_fparser_0_2
hiker Dec 2, 2024
d032d8a
Merge branch 'develop' into update_to_fparser_0_2
hiker Jan 9, 2025
ccc8a39
Removed more accesses to private members.
hiker Jan 9, 2025
34f4985
Added missing type hint.
hiker Jan 9, 2025
686f990
Make flake8 happy.
hiker Jan 9, 2025
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -7,7 +7,7 @@ authors = [
license = {file = 'LICENSE.txt'}
dynamic = ['version', 'readme']
requires-python = '>=3.7, <4'
dependencies = ['fparser']
dependencies = ['fparser >= 0.2']
classifiers = [
'Development Status :: 1 - Planning',
'Environment :: Console',
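The dependency line above moves from an unpinned `fparser` to `fparser >= 0.2`; pip resolves that constraint using PEP 440 ordering. As an illustration of why versions must be compared numerically rather than as strings, here is a minimal sketch (release segments only; real resolution should be left to pip or `packaging.version`):

```python
def meets_minimum(version: str, minimum: str) -> bool:
    """Naively compare dotted release versions numerically.

    Handles plain release segments only; no pre-release or local tags.
    """
    def as_tuple(v: str):
        return tuple(int(part) for part in v.split("."))
    return as_tuple(version) >= as_tuple(minimum)

# String comparison would get "0.10" < "0.2" wrong; numeric tuples do not.
print(meets_minimum("0.10", "0.2"))   # True
print(meets_minimum("0.1.4", "0.2"))  # False
```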
42 changes: 24 additions & 18 deletions source/fab/parse/c.py
@@ -11,44 +11,48 @@
from pathlib import Path
from typing import List, Optional, Union, Tuple

from fab.dep_tree import AnalysedDependent

try:
import clang # type: ignore
import clang.cindex # type: ignore
except ImportError:
clang = None

from fab.build_config import BuildConfig
from fab.dep_tree import AnalysedDependent
from fab.util import log_or_dot, file_checksum

logger = logging.getLogger(__name__)


class AnalysedC(AnalysedDependent):
"""
An analysis result for a single C file, containing symbol definitions and dependencies.
An analysis result for a single C file, containing symbol definitions and
dependencies.

Note: We don't need to worry about compile order with pure C projects; we can compile all in one go.
However, with a *Fortran -> C -> Fortran* dependency chain, we do need to ensure that one Fortran file
is compiled before another, so this class must be part of the dependency tree analysis.
Note: We don't need to worry about compile order with pure C projects; we
can compile all in one go. However, with a *Fortran -> C -> Fortran*
dependency chain, we do need to ensure that one Fortran file is
compiled before another, so this class must be part of the
dependency tree analysis.

"""
# Note: This subclass adds nothing to its parent, which provides everything it needs.
# We'd normally remove an irrelevant class like this but we want to keep the door open
# for filtering analysis results by type, rather than suffix.
pass
# Note: This subclass adds nothing to its parent, which provides
# everything it needs. We'd normally remove an irrelevant class
# like this but we want to keep the door open for filtering
# analysis results by type, rather than suffix.


class CAnalyser(object):
class CAnalyser:
"""
Identify symbol definitions and dependencies in a C file.

"""

def __init__(self):
def __init__(self, config: BuildConfig):

# runtime
self._config = None
self._config = config
self._include_region: List[Tuple[int, str]] = []

# todo: simplify by passing in the file path instead of the analysed tokens?
def _locate_include_regions(self, trans_unit) -> None:
@@ -100,8 +104,7 @@ def _check_for_include(self, lineno) -> Optional[str]:
include_stack.pop()
if include_stack:
return include_stack[-1]
else:
return None
return None

def run(self, fpath: Path) \
-> Union[Tuple[AnalysedC, Path], Tuple[Exception, None]]:
@@ -149,9 +152,11 @@ def run(self, fpath: Path) \
continue
logger.debug('Considering node: %s', node.spelling)

if node.kind in {clang.cindex.CursorKind.FUNCTION_DECL, clang.cindex.CursorKind.VAR_DECL}:
if node.kind in {clang.cindex.CursorKind.FUNCTION_DECL,
clang.cindex.CursorKind.VAR_DECL}:
self._process_symbol_declaration(analysed_file, node, usr_symbols)
elif node.kind in {clang.cindex.CursorKind.CALL_EXPR, clang.cindex.CursorKind.DECL_REF_EXPR}:
elif node.kind in {clang.cindex.CursorKind.CALL_EXPR,
clang.cindex.CursorKind.DECL_REF_EXPR}:
self._process_symbol_dependency(analysed_file, node, usr_symbols)
except Exception as err:
logger.exception(f'error walking parsed nodes {fpath}')
@@ -166,7 +171,8 @@ def _process_symbol_declaration(self, analysed_file, node, usr_symbols):
if node.is_definition():
# only global symbols can be used by other files, not static symbols
if node.linkage == clang.cindex.LinkageKind.EXTERNAL:
# This should catch function definitions which are exposed to the rest of the application
# This should catch function definitions which are exposed to
# the rest of the application
logger.debug(' * Is defined in this file')
# todo: ignore if inside user pragmas?
analysed_file.add_symbol_def(node.spelling)
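The hunk above moves the guarded `clang` import ahead of the fab imports while keeping clang an optional dependency. The pattern it relies on is worth spelling out; the helper below is a sketch (the `clang_available` name is invented for illustration) of how a module stays importable when the libclang bindings are missing:

```python
try:
    import clang          # optional third-party dependency
    import clang.cindex
except ImportError:
    clang = None          # analysis is disabled, but the module still imports

def clang_available() -> bool:
    """True when the libclang bindings were importable."""
    return clang is not None

# Callers check availability instead of crashing at import time.
print(clang_available())
```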
41 changes: 10 additions & 31 deletions source/fab/parse/fortran.py
@@ -11,7 +11,6 @@
from pathlib import Path
from typing import Union, Optional, Iterable, Dict, Any, Set

from fparser.common.readfortran import FortranStringReader # type: ignore
from fparser.two.Fortran2003 import ( # type: ignore
Entity_Decl_List, Use_Stmt, Module_Stmt, Program_Stmt, Subroutine_Stmt, Function_Stmt, Language_Binding_Spec,
Char_Literal_Constant, Interface_Block, Name, Comment, Module, Call_Stmt, Derived_Type_Def, Derived_Type_Stmt,
@@ -21,6 +20,7 @@
from fparser.two.Fortran2008 import ( # type: ignore
Type_Declaration_Stmt, Attr_Spec_List)

from fab.build_config import BuildConfig
from fab.dep_tree import AnalysedDependent
from fab.parse.fortran_common import iter_content, _has_ancestor_type, _typed_child, FortranAnalyserBase
from fab.util import file_checksum, string_checksum
@@ -167,15 +167,21 @@ class FortranAnalyser(FortranAnalyserBase):
A build step which analyses a fortran file using fparser2, creating an :class:`~fab.dep_tree.AnalysedFortran`.

"""
def __init__(self, std=None, ignore_mod_deps: Optional[Iterable[str]] = None):
def __init__(self,
config: BuildConfig,
std: Optional[str] = None,
ignore_mod_deps: Optional[Iterable[str]] = None):
"""
:param config: The BuildConfig to use.
:param std:
The Fortran standard.
:param ignore_mod_deps:
Module names to ignore in use statements.

"""
super().__init__(result_class=AnalysedFortran, std=std)
super().__init__(config=config,
result_class=AnalysedFortran,
std=std)
self.ignore_mod_deps: Iterable[str] = list(ignore_mod_deps or [])
self.depends_on_comment_found = False

@@ -295,33 +301,6 @@ def _process_comment(self, analysed_file, obj):
# without .o means a fortran symbol
else:
analysed_file.add_symbol_dep(dep)
if comment[:2] == "!$":
# Check if it is a use statement with an OpenMP sentinel:
# Use fparser's string reader to discard potential comment
# TODO #327: once fparser supports reading the sentinels,
# this can be removed.
# fparser issue: https://github.com/stfc/fparser/issues/443
reader = FortranStringReader(comment[2:])
try:
line = reader.next()
except StopIteration:
# No other item, ignore
return
try:
# match returns a 5-tuple, the third one being the module name
module_name = Use_Stmt.match(line.strline)[2]
module_name = module_name.string
except Exception:
# Not a use statement in a sentinel, ignore:
return

# Register the module name
if module_name in self.ignore_mod_deps:
logger.debug(f"ignoring use of {module_name}")
return
if module_name.lower() not in self._intrinsic_modules:
# found a dependency on fortran
analysed_file.add_module_dep(module_name)

def _process_subroutine_or_function(self, analysed_file, fpath, obj):
# binding?
Expand Down Expand Up @@ -353,7 +332,7 @@ def _process_subroutine_or_function(self, analysed_file, fpath, obj):
analysed_file.add_symbol_def(name.string)


class FortranParserWorkaround(object):
class FortranParserWorkaround():
"""
Use this class to create a workaround when the third-party Fortran parser is unable to process a valid source file.

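The deleted block above existed only to recover `use` statements hidden behind OpenMP conditional sentinels such as `!$ use omp_lib`; with fparser 0.2 the reader handles those lines itself, so the workaround goes away. For reference, a regex sketch of what the removed code detected; this is an illustration, not the original fparser-based implementation:

```python
import re
from typing import Optional

# Matches a use statement behind an OpenMP conditional sentinel, e.g.
# "!$ use omp_lib" or "!$ use omp_lib, only: omp_get_num_threads".
# Directive sentinels such as "!$omp parallel" deliberately do not match.
_SENTINEL_USE = re.compile(r"^\s*!\$\s+use\s+(\w+)", re.IGNORECASE)

def sentinel_module(comment: str) -> Optional[str]:
    """Return the module named in a sentinel use statement, else None."""
    match = _SENTINEL_USE.match(comment)
    return match.group(1) if match else None

print(sentinel_module("!$ use omp_lib"))     # omp_lib
print(sentinel_module("!$omp parallel do"))  # None
```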
76 changes: 51 additions & 25 deletions source/fab/parse/fortran_common.py
@@ -10,13 +10,14 @@
import logging
from abc import ABC, abstractmethod
from pathlib import Path
from typing import Union, Tuple, Type
from typing import Optional, Tuple, Type, Union

from fparser.common.readfortran import FortranFileReader # type: ignore
from fparser.two.parser import ParserFactory # type: ignore
from fparser.two.utils import FortranSyntaxError # type: ignore

from fab import FabException
from fab.build_config import BuildConfig
from fab.dep_tree import AnalysedDependent
from fab.parse import EmptySourceFile
from fab.util import log_or_dot, file_checksum
@@ -58,49 +59,61 @@ def _typed_child(parent, child_type: Type, must_exist=False):
# Look for a child of a certain type.
# Returns the child or None.
# Raises ValueError if more than one child of the given type is found.
children = list(filter(lambda child: isinstance(child, child_type), parent.children))
children = list(filter(lambda child: isinstance(child, child_type),
parent.children))
if len(children) > 1:
raise ValueError(f"too many children found of type {child_type}")

if children:
return children[0]

if must_exist:
raise FabException(f'Could not find child of type {child_type} in {parent}')
raise FabException(f'Could not find child of type {child_type} '
f'in {parent}')
return None


class FortranAnalyserBase(ABC):
"""
Base class for Fortran parse-tree analysers, e.g. FortranAnalyser and X90Analyser.
Base class for Fortran parse-tree analysers, e.g. FortranAnalyser and
X90Analyser.

"""
_intrinsic_modules = ['iso_fortran_env', 'iso_c_binding']

def __init__(self, result_class, std=None):
def __init__(self, config: BuildConfig,
result_class,
std: Optional[str] = None):
"""
:param config: The BuildConfig object.
:param result_class:
The type (class) of the analysis result. Defined by the subclass.
:param std:
The Fortran standard.

"""
self._config = config
self.result_class = result_class
self.f2008_parser = ParserFactory().create(std=std or "f2008")

# todo: this, and perhaps other runtime variables like it, might be better set at construction
# if we construct these objects at runtime instead...
# runtime, for child processes to read
self._config = None
@property
def config(self) -> BuildConfig:
'''Returns the BuildConfig to use.
'''
return self._config

def run(self, fpath: Path) \
-> Union[Tuple[AnalysedDependent, Path], Tuple[EmptySourceFile, None], Tuple[Exception, None]]:
-> Union[Tuple[AnalysedDependent, Path],
Tuple[EmptySourceFile, None],
Tuple[Exception, None]]:
"""
Parse the source file and record what we're interested in (subclass specific).
Parse the source file and record what we're interested in (subclass
specific).

Reloads previous analysis results if available.

Returns the analysis data and the result file where it was stored/loaded.
Returns the analysis data and the result file where it was
stored/loaded.

"""
# calculate the prebuild filename
@@ -114,9 +127,11 @@ def run(self, fpath: Path) \
# Load the result file into whatever result class we use.
loaded_result = self.result_class.load(analysis_fpath)
if loaded_result:
# This result might have been created by another user; their prebuild folder copied to ours.
# If so, the fpath in the result will *not* point to the file we eventually want to compile,
# it will point to the user's original file, somewhere else. So replace it with our own path.
# This result might have been created by another user; their
# prebuild folder copied to ours. If so, the fpath in the
# result will *not* point to the file we eventually want to
# compile, it will point to the user's original file,
# somewhere else. So replace it with our own path.
loaded_result.fpath = fpath
return loaded_result, analysis_fpath

@@ -125,43 +140,54 @@ def run(self, fpath: Path) \
# parse the file, get a node tree
node_tree = self._parse_file(fpath=fpath)
if isinstance(node_tree, Exception):
return Exception(f"error parsing file '{fpath}':\n{node_tree}"), None
return (Exception(f"error parsing file '{fpath}':\n{node_tree}"),
None)
if node_tree.content[0] is None:
logger.debug(f" empty tree found when parsing {fpath}")
# todo: If we don't save the empty result we'll keep analysing it every time!
# todo: If we don't save the empty result we'll keep analysing
# it every time!
return EmptySourceFile(fpath), None

# find things in the node tree
analysed_file = self.walk_nodes(fpath=fpath, file_hash=file_hash, node_tree=node_tree)
analysed_file = self.walk_nodes(fpath=fpath, file_hash=file_hash,
node_tree=node_tree)
analysed_file.save(analysis_fpath)

return analysed_file, analysis_fpath

def _get_analysis_fpath(self, fpath, file_hash) -> Path:
return Path(self._config.prebuild_folder / f'{fpath.stem}.{file_hash}.an')
return Path(self.config.prebuild_folder /
f'{fpath.stem}.{file_hash}.an')

def _parse_file(self, fpath):
"""Get a node tree from a fortran file."""
reader = FortranFileReader(str(fpath), ignore_comments=False)
reader.exit_on_error = False # don't call sys.exit, it messes up the multi-processing
reader = FortranFileReader(
str(fpath),
ignore_comments=False,
include_omp_conditional_lines=self.config.openmp)
# don't call sys.exit, it messes up the multi-processing
reader.exit_on_error = False

try:
tree = self.f2008_parser(reader)
return tree
except FortranSyntaxError as err:
# we can't return the FortranSyntaxError, it breaks multiprocessing!
# Don't return the FortranSyntaxError, it breaks multiprocessing!
logger.error(f"\nfparser raised a syntax error in {fpath}\n{err}")
return Exception(f"syntax error in {fpath}\n{err}")
except Exception as err:
logger.error(f"\nunhandled error '{type(err)}' in {fpath}\n{err}")
return Exception(f"unhandled error '{type(err)}' in {fpath}\n{err}")
return Exception(f"unhandled error '{type(err)}' in "
f"{fpath}\n{err}")

@abstractmethod
def walk_nodes(self, fpath, file_hash, node_tree) -> AnalysedDependent:
"""
Examine the nodes in the parse tree, recording things we're interested in.
Examine the nodes in the parse tree, recording things we're
interested in.

Return type depends on our subclass, and will be a subclass of AnalysedDependent.
Return type depends on our subclass, and will be a subclass of
AnalysedDependent.

"""
raise NotImplementedError
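`_typed_child` above is reflowed but keeps the same contract: return the single child of the requested type, return None if absent, and fail loudly on duplicates. A self-contained sketch of that contract, using a stand-in `Node` class and generic exceptions in place of fab's `FabException`:

```python
from dataclasses import dataclass, field
from typing import Any, List, Optional, Type

@dataclass
class Node:
    """Stand-in for an fparser parse-tree node."""
    children: List[Any] = field(default_factory=list)

def typed_child(parent: Node, child_type: Type,
                must_exist: bool = False) -> Optional[Any]:
    """Return the single child of the given type, or None.

    Raises ValueError on duplicates, KeyError when must_exist is unmet.
    """
    matches = [c for c in parent.children if isinstance(c, child_type)]
    if len(matches) > 1:
        raise ValueError(f"too many children found of type {child_type}")
    if matches:
        return matches[0]
    if must_exist:
        raise KeyError(f"could not find child of type {child_type}")
    return None

parent = Node(children=["a", 3, "b"])
print(typed_child(parent, int))  # 3
```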
5 changes: 3 additions & 2 deletions source/fab/parse/x90.py
@@ -9,6 +9,7 @@
from fparser.two.Fortran2003 import Use_Stmt, Call_Stmt, Name, Only_List, Actual_Arg_Spec_List, Part_Ref # type: ignore

from fab.parse import AnalysedFile
from fab.build_config import BuildConfig
from fab.parse.fortran_common import FortranAnalyserBase, iter_content, logger, _typed_child
from fab.util import by_type

@@ -64,8 +65,8 @@ class X90Analyser(FortranAnalyserBase):
# Makes a parsable fortran version of x90.
# todo: Use hashing to reuse previous analysis results.

def __init__(self):
super().__init__(result_class=AnalysedX90)
def __init__(self, config: BuildConfig):
super().__init__(config=config, result_class=AnalysedX90)

def walk_nodes(self, fpath, file_hash, node_tree) -> AnalysedX90: # type: ignore

10 changes: 4 additions & 6 deletions source/fab/steps/analyse.py
@@ -130,8 +130,10 @@ def analyse(
unreferenced_deps = list(unreferenced_deps or [])

# todo: these seem more like functions
fortran_analyser = FortranAnalyser(std=std, ignore_mod_deps=ignore_mod_deps)
c_analyser = CAnalyser()
fortran_analyser = FortranAnalyser(config=config,
std=std,
ignore_mod_deps=ignore_mod_deps)
c_analyser = CAnalyser(config=config)

# Creates the *build_trees* artefact from the files in `self.source_getter`.

Expand All @@ -144,10 +146,6 @@ def analyse(
# - At this point we have a source tree for the entire source.
# - (Optionally) Extract a sub tree for every root symbol, if provided. For building executables.

# todo: code smell - refactor (in another PR to keep things small)
fortran_analyser._config = config
c_analyser._config = config

# parse
files: List[Path] = source_getter(config.artefact_store)
analysed_files = _parse_files(config, files=files, fortran_analyser=fortran_analyser, c_analyser=c_analyser)
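The deleted `fortran_analyser._config = config` lines were the "code smell" the removed todo pointed at: configuration was poked into a private attribute after construction. The PR replaces that with constructor injection plus a read-only property, a pattern sketched here with a hypothetical `Analyser` class:

```python
class Analyser:
    """Sketch of the injected-config pattern this PR adopts."""

    def __init__(self, config):
        # The config is set exactly once, at construction time, so every
        # method can rely on it being present.
        self._config = config

    @property
    def config(self):
        """Read-only access; callers never reach into _config."""
        return self._config

analyser = Analyser(config={"openmp": True})
print(analyser.config["openmp"])  # True
```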