Title: Hadoop as an Infrastructure for Cloud and Big Data Software Development (2017/02/21)
Hadoop is an infrastructure for distributed data management and parallel processing that runs on a cluster of physical or virtual computers. In this workshop, after getting familiar with the concepts of HPC/HTC computing, we also cover the MapReduce programming model and HDFS. Researchers and students working in this field will learn how to install Hadoop, create a MapReduce program, and run YARN as the resource manager in Hadoop. This workshop will be useful for students and researchers in HPC/HTC computing, Cloud Computing, Big Data, New Operating Systems, Real-time Data Mining and Machine Learning, Image Processing, Data Analysis, etc.
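The MapReduce programming model covered in the workshop is easiest to see in the classic word-count example. The sketch below simulates the map, shuffle, and reduce phases in plain Python; on a real Hadoop cluster the mapper and reducer would run as separate distributed tasks (e.g. via Hadoop Streaming), but the logic is the same:

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the input line.
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    # Reduce phase: sum all counts collected for one word.
    return word, sum(counts)

def map_reduce(lines):
    # Shuffle phase: group mapper output by key, as Hadoop does
    # between the map and reduce stages.
    groups = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            groups[word].append(count)
    return dict(reducer(w, c) for w, c in groups.items())

result = map_reduce(["big data needs big clusters",
                     "hadoop handles big data"])
# result counts each word across all input lines, e.g. "big" -> 3
```

Because the mapper sees one line at a time and the reducer sees one key at a time, both can be spread across many machines, which is exactly what Hadoop automates.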
Workshops on HPC/HTC Computing:
Title: CPU/GPU CUDA Programming (2017/02/22)
Nowadays, with the increasing amount of data and computation, there is a serious need for parallel and accelerated processing in a variety of sciences. CUDA is a well-known platform for parallel processing. It allows us to use GPUs alongside CPUs to increase computational power. Using CUDA, we are able to develop high-performance applications that process big data without requiring Cloud and Big Data infrastructures. This fundamental workshop will be useful for students and researchers in HPC/HTC computing, Big Data, New Operating Systems, Real-time Data Mining and Machine Learning, Image Processing, Data Analysis, etc.
Title: Soft Computing: Techniques, Tools, and Their Applications in Engineering
Conventional computing (hard computing) is a collection of methods based on crisp systems, binary logic, numerical analysis, and exact algorithms. These methods often require long computation times and substantial knowledge of the problem being solved. In contrast, soft computing is a group of techniques based on fuzzy logic, artificial neural networks, evolutionary computing, machine learning, and probabilistic reasoning. The latter can solve real-life problems by exploiting tolerance for uncertainty and partial truth. For complex and NP-hard problems, hard computing techniques are not efficient because the search space grows exponentially with the problem size, making an exhaustive search impractical; in such cases, finding the best solution takes a long time. Because soft computing techniques are able to reach a good solution in reasonable time, they have been successfully applied to a wide variety of problems over the last two decades.
The aim of this workshop is to familiarize participants with the methods and tools of soft computing, such as metaheuristic algorithms, artificial neural networks, fuzzy logic, and hybrid methods. This workshop is useful for all researchers who are interested in artificial intelligence and its applications in various fields.
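As a small taste of the metaheuristics covered in the workshop, the sketch below implements a minimal (1+1) evolution strategy, one of the simplest evolutionary algorithms, minimizing the sphere function f(x) = Σ xᵢ². The step size, decay rate, and iteration count here are illustrative choices, not tuned values:

```python
import random

def sphere(x):
    # Objective function: minimized at the origin, where f = 0.
    return sum(v * v for v in x)

def one_plus_one_es(dim=3, iterations=2000, step=0.5, decay=0.999, seed=42):
    # (1+1) evolution strategy: keep a single parent solution, mutate it
    # with Gaussian noise, accept the child only if it is no worse, and
    # slowly shrink the mutation step so the search can converge.
    rng = random.Random(seed)
    parent = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    for _ in range(iterations):
        child = [v + rng.gauss(0.0, step) for v in parent]
        if sphere(child) <= sphere(parent):
            parent = child
        step *= decay
    return parent

best = one_plus_one_es()
```

Unlike an exhaustive search, the algorithm never enumerates the search space; it trades a guarantee of optimality for a good solution found quickly, which is the central idea behind soft computing.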
Millions of engineers and scientists worldwide use MATLAB® to analyze data and design systems. The MATLAB platform is optimized for solving engineering and scientific problems. The matrix-based MATLAB language is a natural way to express computational mathematics. Built-in graphics make it easy to visualize and gain insights from data. A vast library of prebuilt toolboxes lets you get started right away with algorithms essential to your domain. In this workshop, the MATLAB environment will be introduced, and methods for expressing computational mathematics, writing m-files, using loops and input/output commands, drawing plots, and working with the help system will be taught. It is useful for students and researchers in machine learning, signal processing, image processing, computer vision, communications, computational finance, control design, robotics, and much more.