Hadoop is an open-source programming framework developed by Apache to process big data. It uses HDFS (Hadoop Distributed File System) to store data across all the datanodes in the cluster in a distributed manner, and the MapReduce model to process that data.

Install Hadoop Multinode Cluster

The Namenode (NN) is the master daemon that controls HDFS, and the Jobtracker (JT) is the master daemon for the MapReduce engine.

Requirements

In this tutorial I'm using two CentOS 6.3 VMs, 'master' and 'node' (master and node are my hostnames).
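Since the tutorial identifies the two VMs by hostname, each machine needs to resolve the other's name before the cluster daemons can talk to each other. A minimal sketch of the name resolution setup follows; the IP addresses are placeholders I've assumed for illustration, not values from this tutorial:

```
# /etc/hosts -- add these entries on BOTH the master and node VMs.
# Replace the example IP addresses with the real addresses of your VMs.
192.168.1.10   master
192.168.1.11   node
```

With these entries in place, `ping node` from the master (and `ping master` from the node) should succeed by hostname, which the Hadoop daemons rely on.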
http://tecadmin.net/install-zabbix-agent-on-centos-rhel/

The Zabbix Agent must be installed on every remote system that needs to be monitored through the Zabbix server. The agent collects resource-utilization and application data on the client system and provides that information to the Zabbix server on request. There are two types of checks between the Zabbix server and client.
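The two check types Zabbix distinguishes are passive checks (the server polls the agent) and active checks (the agent pushes data to the server). As a hedged sketch, the corresponding directives in the agent's configuration file might look like this; the IP address and hostname below are placeholder examples, not values from this article:

```
# /etc/zabbix/zabbix_agentd.conf -- example directives.
# Replace 192.168.1.5 with your Zabbix server's actual address.

# Passive checks: the Zabbix server connects to the agent and polls it.
Server=192.168.1.5

# Active checks: the agent connects to the server and pushes its data.
ServerActive=192.168.1.5

# Must match the host name configured for this client in the
# Zabbix server frontend.
Hostname=node
```

A host can use either mechanism or both; passive checks require the server to reach the agent's port (10050 by default), while active checks only need the agent to reach the server.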