Parallel Language and Compiler
Nadar Saraswathi College of Arts and
Science
Presented by
S. Vijayalakshmi, M.Sc. (IT)
INTRODUCTION
• The environment for parallel computers is much more demanding than that for
sequential computers.
• The programming environment is a collection of software support tools.
• To break this hardware/software barrier, we need a parallel software
environment
• that provides better tools for users to implement parallelism and to debug
programs.
LANGUAGE FEATURES
• These features are idealized for general-purpose applications.
• Some of the features are already found in existing language/compiler
developments.
• Language features are classified into six categories.
CATEGORIES
• Optimization features
• Availability features
• Synchronization/communication features
• Control of parallelism
• Data parallelism features
• Process management features
OPTIMIZATION FEATURES
• These features convert sequentially coded programs into parallel form.
• The purpose is to match the software parallelism with the hardware parallelism of
the target machine.
• Automated parallelizer (e.g., Alliant FX Fortran)
• Semi-automated parallelizer (requires programmer interaction)
• Interactive restructuring support (static analyzer, run-time statistics, data-flow
graph)
AVAILABILITY FEATURES
• Features that enhance user-friendliness, make the language portable to a large
class of parallel computers, and expand the applicability of software libraries.
• SCALABILITY – the language scales to the number of processors available,
independent of the hardware topology.
• COMPATIBILITY – compatible with established sequential languages.
• PORTABILITY – portable to shared-memory multiprocessors, message-passing
multicomputers, or both.
SYNCHRONIZATION/COMMUNICATION FEATURES
• Single-assignment languages
• Remote procedure call (RPC)
• Dataflow languages such as Id
• Send/receive for message passing
• Barriers, mailboxes, semaphores, monitors
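A minimal sketch of two of these primitives, using Python's standard threading and queue modules as stand-ins (the slide names the concepts, not any particular language's API): a queue acts as a mailbox for send/receive, and a barrier synchronizes the two threads.

```python
import threading
import queue

mailbox = queue.Queue()          # mailbox for send/receive message passing
barrier = threading.Barrier(2)   # both threads must arrive before either proceeds
results = []

def producer():
    mailbox.put("work item")     # send a message
    barrier.wait()               # synchronize with the consumer

def consumer():
    item = mailbox.get()         # receive (blocks until a message arrives)
    results.append(item)
    barrier.wait()

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(results)  # ['work item']
```

The same pattern maps onto real message-passing systems, where `put`/`get` become explicit send/receive operations between processors.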
CONTROL OF PARALLELISM
• Coarse, medium, or fine grain
• Explicit versus implicit parallelism
• Global parallelism
• Task-split parallelism
• Shared task queue
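A shared task queue, the last item above, can be sketched as follows (a hypothetical illustration using Python threads, not a construct from any specific parallel language): workers repeatedly take tasks from one common queue until it is empty.

```python
import threading
import queue

tasks = queue.Queue()
for n in range(10):              # fill the shared task queue
    tasks.put(n)

lock = threading.Lock()
total = [0]

def worker():
    while True:
        try:
            n = tasks.get_nowait()   # grab the next task from the shared queue
        except queue.Empty:
            return                   # no tasks left: this worker finishes
        with lock:                   # protect the shared accumulator
            total[0] += n * n

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(total[0])  # sum of squares 0..9 = 285
```

The shared queue provides automatic load balancing: faster workers simply take more tasks.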
DATA PARALLELISM FEATURES
• Used to specify how data are accessed and distributed on either SIMD or
MIMD computers.
• Run-time automatic decomposition
• Mapping specification
• Virtual processor support
• Direct access to shared data
• SPMD (single program, multiple data)
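The SPMD idea can be sketched in a few lines (a simplified illustration using a Python thread pool; real SPMD runs one copy of the program per processor): every worker executes the same function, and its rank selects which partition of the data it works on.

```python
from multiprocessing.pool import ThreadPool

data = list(range(100))
NUM_WORKERS = 4

def spmd_body(rank):
    # Every worker runs this same program; 'rank' selects its data partition.
    chunk = data[rank::NUM_WORKERS]   # simple cyclic mapping of data to workers
    return sum(chunk)                 # local computation on the local partition

with ThreadPool(NUM_WORKERS) as pool:
    partial_sums = pool.map(spmd_body, range(NUM_WORKERS))

print(sum(partial_sums))  # 4950, same as a sequential sum over all the data
```

The cyclic slicing here stands in for the mapping specification mentioned above; block or block-cyclic mappings would change only the `chunk` line.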
PROCESS MANAGEMENT FEATURES
• Needed to support the efficient creation of parallel processes and the
implementation of multithreading or multitasking.
• Dynamic process creation at run time
• Lightweight processes (threads), in contrast to heavyweight UNIX processes
• Replicated work
• Partitioned networks
• Automatic load balancing
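Dynamic creation of lightweight processes, the first two items above, might look like this (a sketch using Python threads, which are cheap to create compared with full processes): one thread is spawned on demand for each incoming task.

```python
import threading

results = []
lock = threading.Lock()

def handle(task_id):
    # Each task is served by its own lightweight thread.
    with lock:
        results.append(task_id * 2)

threads = []
for task_id in range(5):        # threads are created dynamically, on demand
    t = threading.Thread(target=handle, args=(task_id,))
    t.start()
    threads.append(t)

for t in threads:
    t.join()

print(sorted(results))  # [0, 2, 4, 6, 8]
```

Spawning a thread per task like this is only practical because threads are lightweight; forking a heavyweight process per task would be far more expensive.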
COMPILERS
• Using high-level languages for source code has become a necessity on modern
computers.
ROLE OF COMPILER
• Removes the burden of program optimization and code generation from the
programmer.
THREE PHASES OF A COMPILER
• FLOW ANALYSIS
• OPTIMIZATION
• CODE GENERATION
FLOW ANALYSIS
• Flow analysis reveals the program's flow patterns in order to determine data and
control dependences in the source code.
• Flow analysis is conducted at different execution levels on different parallel
computers.
• Instruction-level parallelism is exploited in superscalar or VLIW processors;
loop-level parallelism in SIMD or vector computers.
• Task-level parallelism in multiprocessors, multicomputers, or networks of
workstations.
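The dependences that flow analysis looks for can be illustrated with two tiny loops (a hypothetical example, written in Python purely to show the distinction):

```python
a = list(range(8))

# Loop with NO loop-carried dependence: iteration i reads only a[i], so a
# flow analyzer can mark all iterations as safe to run in parallel.
b = [0] * 8
for i in range(8):
    b[i] = a[i] * 2            # iterations are independent

# Loop WITH a loop-carried (flow) dependence: iteration i reads c[i-1],
# which iteration i-1 wrote, so the iterations must execute in order.
c = [1] * 8
for i in range(1, 8):
    c[i] = c[i - 1] + a[i]     # prefix sum: a true dependence chain

print(b[7], c[7])  # 14 29
```

Only the first loop can be vectorized or distributed directly; the second needs a specialized transformation (e.g., a parallel prefix algorithm) before it can be parallelized.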
OPTIMIZATION
• The transformation of user programs in order to exploit the hardware capabilities
as fully as possible.
• Transformations can be conducted at the loop level, locality level, or prefetching
level.
• The ultimate goal of program optimization is to maximize the speed of code
execution.
• This involves minimizing code length and memory accesses and exploiting
program parallelism.
• Optimization sometimes should be conducted at the algorithmic level, and then it
must involve the programmer.
CODE GENERATION
• Code generation usually involves transformation from one representation to
another, called an intermediate form.
• It is even more demanding for parallel machines because parallel constructs must
be included.
• Code generation is closely tied to the instruction scheduling policies used.
• It is optimized to encourage a high degree of parallelism.
• Parallel code generation differs greatly across computer classes; scheduling may
be done in software or in hardware.
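A toy illustration of an intermediate form (a hypothetical helper written with Python's standard ast module, not any production compiler): lowering the expression (a + b) * c into three-address code, the kind of representation a code generator schedules before emitting machine instructions.

```python
import ast

def to_three_address(expr):
    """Flatten a binary expression into three-address instructions."""
    instructions = []
    counter = [0]

    def visit(node):
        if isinstance(node, ast.Name):
            return node.id
        if isinstance(node, ast.BinOp):
            left, right = visit(node.left), visit(node.right)
            op = {ast.Add: '+', ast.Sub: '-', ast.Mult: '*'}[type(node.op)]
            tmp = f"t{counter[0]}"       # fresh temporary for this result
            counter[0] += 1
            instructions.append(f"{tmp} = {left} {op} {right}")
            return tmp
        raise ValueError("unsupported expression node")

    visit(ast.parse(expr, mode="eval").body)
    return instructions

print(to_three_address("(a + b) * c"))  # ['t0 = a + b', 't1 = t0 * c']
```

Each three-address instruction names its operands explicitly, which is what makes dependence checking and instruction scheduling tractable for the code generator.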