Wednesday, April 9, 2014

Pivotal HD Single-Node VM

source: http://pivotalhd.cfapps.io/getting-started/pivotalhd-vm.html

System requirements

  • 8 GB RAM (The VM consumes 4 GB)
  • At least 30 GB free disk space
  • At least a dual-core CPU
  • NAT networking configured in the VM
  • 7z Archive utility

Pivotal HD VM contents

Pivotal HD VM is shipped with the following Pivotal products:
  • Pivotal HD 2.0 - Hadoop 2.0.5, Zookeeper, HBase, Hive, Pig, Mahout, Madlib
  • Pivotal HAWQ 1.1.4
  • Pivotal Extension Framework (PXF) 1.1.3
  • Pivotal GemFire XD 0.5 Beta
  • Pivotal Command Center 2.2
  • Product usage documentation
Other installed packages:
  • JDK 7
  • Spring Tool Suite (Eclipse-based IDE)
  • Maven
  • Git
  • Retail Demo data for tutorials
  • Sample Code

Credentials

The Pivotal HD Single-Node VM uses the following credentials:
Credential            Username  Password
Hadoop                gpadmin   password
VM root user          root      password
DataLoader Web UI     gpadmin   password
Command Center login  gpadmin   Gpadmin1
sudo                  gpadmin   password (the gpadmin user has partial sudo privileges)

Note: sudo is configured so that the gpadmin account (the default user for this VM) can run commands via sudo, without providing a password, as any of the other system accounts (mapred, hdfs, hadoop, etc.).
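To illustrate the sudo configuration described above, a quick sketch of acting as one of the Hadoop system accounts from the gpadmin shell. The `hdfs dfs` command exists on the VM; the graceful-degradation check is an addition so the snippet does not fail on hosts without those accounts:

```shell
# On the VM, gpadmin can act as the Hadoop system accounts without a password, e.g.:
#   sudo -u hdfs hdfs dfs -ls /     # run a command as the hdfs superuser
#   sudo -u mapred id               # run a command as the mapred account
# The check below degrades gracefully on hosts that lack those accounts:
target_user="hdfs"
if id "$target_user" >/dev/null 2>&1; then
  # -n: fail instead of prompting if a password would be required
  result=$(sudo -n -u "$target_user" id -un 2>/dev/null \
           || echo "sudo not permitted for $target_user here")
else
  result="no $target_user account on this host"
fi
echo "$result"
```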

Scripts

You can use the following scripts to manage your Pivotal HD Single-Node VM:
  • start_all.sh – Starts all Hadoop services.
  • stop_all.sh – Stops all Hadoop services.
  • start_gfxd.sh – Starts GemFire XD. (Note that this script also stops some Hadoop services that are not needed for the GemFire XD tutorials.)
  • stop_gfxd.sh – Stops GemFire XD.
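The scripts come in start/stop pairs per stack. A small helper function (hypothetical, not shipped with the VM) makes the mapping explicit and can be dropped into a shell profile:

```shell
# Map a stack name to its control script on the VM desktop.
# Usage: vm_script <start|stop> <hadoop|gfxd>
vm_script() {
  case "$2" in
    hadoop) echo "$HOME/Desktop/${1}_all.sh" ;;
    gfxd)   echo "$HOME/Desktop/${1}_gfxd.sh" ;;
    *)      echo "unknown stack: $2" >&2; return 1 ;;
  esac
}

vm_script start hadoop   # prints the path of the Hadoop start script
vm_script stop gfxd      # prints the path of the GemFire XD stop script
```

You would then run, for example, `"$(vm_script start hadoop)"` to bring the Hadoop services up.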

Installing the Pivotal HD Single-Node VM

  1. Note: This tutorial is intended for users of the Pivotal HD Single-Node VM version 2.0 who received an early-access version of the VM. The instructions in this document will not work correctly with other versions. Version 2.0 of the VM will be available for download when PHD 2.0 is released.
  2. Enter the following command to extract the files using 7z:
    7za x ~/VMWare/PIVHDSNE_VMWARE_VM-2.0.0-40-2.7z
    If you do not have a tool that can expand a 7z archive, you can download the 7z utility from: http://www.7-zip.org/
  3. Start the VM by double-clicking the .vmx file within the newly-created PIVHDSNE_VMWARE_VM-2.0.0-40 directory.
  4. After the VM starts, log in to the gpadmin user account using the password password (Note: this is also the root password.)
  5. In the VM, launch the Firefox Web browser to see the Command Center Web UI at http://localhost:5443/.
  6. Log in to the Command Center with gpadmin as the username and Gpadmin1 as the password.
  7. Start Pivotal HD and HAWQ using the ~/Desktop/start_all.sh script.
    Note: You can stop Pivotal HD and HAWQ using the ~/Desktop/stop_all.sh script.

How to use this VM:

  1. Start the Hadoop services using the start_all.sh script located on the desktop.
  2. Follow the tutorials in the Setting up the Pivotal HD Tutorial section of this site.

Setting Up the Pivotal HD Tutorial

The Getting Started with Pivotal HD tutorial provides a set of examples that demonstrate loading, querying, and running MapReduce applications on a Pivotal HD 2.0 distribution. The tutorial is organized into several groups of examples, each demonstrating specific tasks.

Prerequisites

To run this tutorial, you use the Pivotal HD Single-Node Virtual Machine. This VM contains the Pivotal HD software, sample data, scripts, documentation, Java, Ant, Maven, and the Spring Tool Suite (STS) IDE. See Pivotal HD Single-Node VM.

Loading Sample Data

All of the examples use a common data set. The raw data is already included on the file system of the Single-Node VM. You load this data to run the examples in this tutorial.

Source code for the tutorials

The source code for the tutorials is available at https://github.com/gopivotal/pivotal-samples.git. This code is also included on the filesystem of the Pivotal HD Single-Node VM.
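Since the VM already carries a copy, cloning is only needed on another machine. A sketch that checks for an existing checkout first (the destination path is an assumption, not a VM convention):

```shell
REPO="https://github.com/gopivotal/pivotal-samples.git"
DEST="$HOME/pivotal-samples"   # hypothetical checkout path; adjust to taste
if [ -d "$DEST/.git" ]; then
  msg="samples already checked out in $DEST"
else
  # Shown rather than executed, so the snippet is safe to run anywhere:
  msg="to fetch them, run: git clone $REPO $DEST"
fi
echo "$msg"
```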

Running the Tutorial

The following examples demonstrate Pivotal HD features:

Map Reduce Examples

HAWQ Examples

Hive

Pig
