SSH Into Google Cloud Compute Engine Instance Using Secure Shell Client

I need to set up and test my web-app using Google Cloud Compute Engine. How do I connect to an instance using ssh on an Ubuntu Linux or Apple OS X based system?

 

By default, you can always connect to an instance using ssh. This is useful so you can manage and configure your instances beyond the basic configuration enabled by gcutil or the REST API. The easiest way to ssh into an instance is to use the gcutil/gcloud command from your local Linux / OS X based system. The following steps are required:

Install gcutil/google sdk

Authorize instance

Verify instance status

Create ssh keys

Connect using gcutil or ssh client

Step #1: Install gcutil

gcutil runs on UNIX-based operating systems such as Linux and Mac OS X. To use gcutil, you must have Python 2.6.x or 2.7.x installed on your computer; gcutil does not support Python 3.x. Python is installed by default on most Linux distributions and on Mac OS X. Open the Terminal and type the following commands, or grab the gcutil tool from the Google Cloud SDK download page.
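If you are not sure which Python interpreter you have, check it first (a quick sanity test; the exact version string will vary by system):

$ python -V

$ which python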

Debian / Ubuntu / RHEL / CentOS Linux and OS X Unix users should type the following commands:

Open a terminal and type:

## Download it ##

$ wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz

$ tar -zxvf google-cloud-sdk.tar.gz

## Install it ##

$ bash google-cloud-sdk/install.sh

 

Sample outputs:

Welcome to the Google Cloud SDK!

 

The Google Cloud SDK is currently in developer preview. To help improve the

quality of this product, we collect anonymized data on how the SDK is used.

You may choose to opt out of this collection now (by choosing  N  at the below

prompt), or at any time in the future by running the following command:

gcloud config set --scope=user disable_usage_reporting true

 

Do you want to help improve the Google Cloud SDK (Y/n)?  n

 

 

This will install all the core command line tools necessary for working with

the Google Cloud Platform.

 

 

The following components will be installed:

——————————————————————————————-

| BigQuery Command Line Tool                                        |     2.0.18 | < 1 MB |

| BigQuery Command Line Tool (Platform Specific)                    |     2.0.18 | < 1 MB |

| Cloud DNS Admin Command Line Interface                            | 2015.04.29 | < 1 MB |

| Cloud SDK Core Command Line Tools                                 |          1 |        |

| Cloud SDK Core Libraries (Platform Specific)                      | 2014.10.20 | < 1 MB |

| Cloud SQL Admin Command Line Interface                            | 2015.04.09 | < 1 MB |

| Cloud Storage Command Line Tool                                   |       4.12 | 2.5 MB |

| Cloud Storage Command Line Tool (Platform Specific)               |        4.6 | < 1 MB |

| Compute Engine Command Line Interface                             | 2015.04.29 | < 1 MB |

| Compute Engine Command Line Tool (deprecated)                     |     1.16.5 | < 1 MB |

| Compute Engine Command Line Tool (deprecated) (Platform Specific) |     1.16.5 | < 1 MB |

| Default set of gcloud commands                                    | 2015.04.29 | < 1 MB |

| Native extensions for gcloud commands (Mac OS X, x86_64)          |     0.15.0 | 4.0 MB |

——————————————————————————————-

 

|- Creating update staging area                             -|

|============================================================|

 

|- Installing: BigQuery Command Line Tool                   -|

|============================================================|

|- Installing: BigQuery Command Line Tool (Platform Spec… -|

|============================================================|

|- Installing: Cloud DNS Admin Command Line Interface       -|

|============================================================|

|- Installing: Cloud SDK Core Command Line Tools            -|

|============================================================|

|- Installing: Cloud SDK Core Libraries (Platform Specific) -|

|============================================================|

|- Installing: Cloud SQL Admin Command Line Interface       -|

|============================================================|

|- Installing: Cloud Storage Command Line Tool              -|

|============================================================|

|- Installing: Cloud Storage Command Line Tool (Platform… -|

|============================================================|

|- Installing: Compute Engine Command Line Interface        -|

|============================================================|

|- Installing: Compute Engine Command Line Tool (depreca… -|

|============================================================|

|- Installing: Compute Engine Command Line Tool (depreca… -|

|============================================================|

|- Installing: Default set of gcloud commands               -|

|============================================================|

|- Installing: Native extensions for gcloud commands (Ma… -|

|============================================================|

 

Creating backup and activating new installation…

 

Update done!

Modify profile to update your $PATH and enable bash completion? (Y/n)?  y

 

The Google Cloud SDK installer will now prompt you to update an rc

file to bring the Google Cloud CLIs into your environment.

 

Enter path to an rc file to update, or leave blank to use

[/Users/veryv/.bash_profile]:

Backing up [/Users/veryv/.bash_profile] to [/Users/veryv/.bash_profile.backup].

[/Users/veryv/.bash_profile] has been updated.

Start a new shell for the changes to take effect.

See how to install the gcutil tool to manage Google Compute Engine on Linux / Unix for more information.

Step #2: Authenticating to Google Compute Engine

The syntax is:

gcloud auth login

gcloud auth login --project=YOUR-PROJECT-ID-HERE

If your project id is “apache-cluster”, enter:

gcloud auth login --project=apache-cluster

Sample outputs:

Fig.01: Authenticating to Google Compute Engine using gcutil command

 

Open a web browser and go to the specified URL. Click the Grant Access link. The page will display an authorization code. Copy this code. Paste the authorization code into the waiting gcloud auth terminal and press enter. Type the following command to cache the project ID:

gcloud config set project YOUR-PROJECT-ID-HERE
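To confirm that the account and project were cached correctly, you can print the active gcloud configuration (a quick read-only check; the exact fields shown depend on your SDK version):

gcloud config list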

Step #3: Verify instance status

Type the following command:

$ gcloud compute instances list

 

Sample outputs:

NAME       ZONE         MACHINE_TYPE INTERNAL_IP   EXTERNAL_IP     STATUS

instance-1 asia-east1-c f1-micro     10.240.xx.yyy 104.155.xxx.zzz RUNNING

Note: the instance-1 instance is running in the asia-east1-c zone.
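If you already know the zone, you can restrict the listing instead of querying every zone (an optional refinement; this assumes the --zones filter is available in your SDK release):

$ gcloud compute instances list --zones asia-east1-c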

Step #4: Create ssh keys

The syntax is:

gcloud compute ssh instance_name_here

gcloud compute ssh USER@instance_name_here

gcloud compute ssh USER@instance_name_here -- arg1 arg2

gcloud compute --project PROJECT_ID_HERE ssh instance_name_here

In this example, connect to db1 instance using ssh:

$ gcloud compute ssh db1

WARNING: Consider passing --zone=us-central1-a to avoid the unnecessary zone lookup which requires extra API calls.

INFO: Zone for db1 detected as us-central1-a.

WARNING: You don't have an ssh key for Google Compute Engine. Creating one now…

Enter passphrase (empty for no passphrase): TYPE-YOUR-PASSPHRASE-HERE

Enter same passphrase again: TYPE-YOUR-PASSPHRASE-HERE

INFO: Updated project with new ssh key. It can take several minutes for the instance to pick up the key.

INFO: Waiting 300 seconds before attempting to connect.

The gcloud tool creates local files to store your public and private keys, and copies your public key to the project. By default, the ssh keys are stored in the following files on your local system (you can inspect them as shown after this list):

$HOME/.ssh/google_compute_engine – Your private key

$HOME/.ssh/google_compute_engine.pub – Your public key
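You can inspect the generated key pair with standard OpenSSH tools, for example to print the public key and its fingerprint (read-only commands; nothing is modified):

$ cat $HOME/.ssh/google_compute_engine.pub

$ ssh-keygen -lf $HOME/.ssh/google_compute_engine.pub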

Step #5: Connect using gcutil or ssh client

The syntax is:

gcloud compute ssh instance_name_here

gcloud compute ssh USER@instance_name_here --zone ZONE_NAME_HERE

OR

ssh -o UserKnownHostsFile=/dev/null -o CheckHostIP=no -o StrictHostKeyChecking=no -i $HOME/.ssh/google_compute_engine -A -p 22 $USER@TYPE-GOOGLE-COMPUTE-ENGINE-PUBLIC-IP-HERE

In this example, connect to the ‘instance-1’ instance using gcloud tool:

gcloud compute ssh instance-1

Sample outputs:

For the following instances:

– [instance-1]

choose a zone:

[1] asia-east1-c

[2] asia-east1-a

[3] asia-east1-b

[4] europe-west1-d

[5] europe-west1-c

[6] europe-west1-b

[7] us-central1-c

[8] us-central1-b

[9] us-central1-a

[10] us-central1-f

Please enter your numeric choice:  1

 

Warning: Permanently added '104.155.xxx.zzz' (RSA) to the list of known hosts.

[vivek@instance-1 ~]$

In this example, connect to the db1 (public ip 1.2.3.4) instance using ssh command:

ssh -o UserKnownHostsFile=/dev/null -o CheckHostIP=no -o StrictHostKeyChecking=no -i $HOME/.ssh/google_compute_engine -A -p 22 vivek@1.2.3.4

Sample sessions:

Fig.02: Connecting to an Instance Using ssh
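If you connect to the same instance this way frequently, you can wrap the long ssh invocation in a small shell alias (a convenience sketch; substitute your own user name and external IP):

$ alias gce-ssh='ssh -o UserKnownHostsFile=/dev/null -o CheckHostIP=no -o StrictHostKeyChecking=no -i $HOME/.ssh/google_compute_engine -A -p 22'

$ gce-ssh vivek@1.2.3.4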

To SSH into ‘db3’ in zone asia-east1-c, run:

gcloud compute ssh db3 --zone asia-east1-c

You can also run a command on the virtual machine. For example, to get a snapshot of the guest’s process tree, run:

gcloud compute ssh db3 --zone asia-east1-c --command "ps -ejH"

If you are using the Google container virtual machine image, you can SSH into one of your containers with:

gcloud compute ssh db3 --zone asia-east1-c --container CONTAINER

How do I login as root user?

For security reasons, the standard Google images do not provide the ability to ssh in directly as root. The instance creator and any users that were added using the --authorized_ssh_keys flag or the sshKeys metadata value are automatically administrators on the instance, with the ability to run sudo without a password. Type the following command to switch to the root user:

sudo -s

Sample session:

Fig.03: Root Access and Instance Administrators using the ‘sudo -s’ command on Google compute instance

Optional: Update your gcloud tools

Type the following command:

gcloud components update

Sample outputs:

The following components will be updated:

———————————————————————–

| BigQuery Command Line Tool                    |     2.0.18 | < 1 MB |

| Cloud DNS Admin Command Line Interface        | 2015.04.29 | < 1 MB |

| Cloud SDK Core Libraries                      | 2015.04.29 | 1.8 MB |

| Cloud SDK Core Libraries (Platform Specific)  | 2014.10.20 | < 1 MB |

| Cloud SQL Admin Command Line Interface        | 2015.04.09 | < 1 MB |

| Cloud Storage Command Line Tool               |       4.12 | 2.5 MB |

| Compute Engine Command Line Interface         | 2015.04.29 | < 1 MB |

| Compute Engine Command Line Tool (deprecated) |     1.16.5 | < 1 MB |

———————————————————————–

The following components will be installed:

———————————————————————————-

| Default set of gcloud commands                           | 2015.04.29 | < 1 MB |

| Native extensions for gcloud commands (Mac OS X, x86_64) |     0.15.0 | 4.0 MB |

———————————————————————————-

 

Do you want to continue (Y/n)?  y

 

Creating update staging area…

 

Uninstalling: BigQuery Command Line Tool … Done

Uninstalling: Cloud DNS Admin Command Line Interface … Done

Uninstalling: Cloud SDK Core Libraries … Done

Uninstalling: Cloud SDK Core Libraries (Platform Specific) … Done

Uninstalling: Cloud SQL Admin Command Line Interface … Done

Uninstalling: Cloud Storage Command Line Tool … Done

Uninstalling: Compute Engine Command Line Interface … Done

Uninstalling: Compute Engine Command Line Tool (deprecated) … Done

 

Installing: BigQuery Command Line Tool … Done

Installing: Cloud DNS Admin Command Line Interface … Done

Installing: Cloud SDK Core Libraries … Done

Installing: Cloud SDK Core Libraries (Platform Specific) … Done

Installing: Cloud SQL Admin Command Line Interface … Done

Installing: Cloud Storage Command Line Tool … Done

Installing: Compute Engine Command Line Interface … Done

Installing: Compute Engine Command Line Tool (deprecated) … Done

Installing: Default set of gcloud commands … Done

Installing: Native extensions for gcloud commands (Mac OS X, x86_64) … Done

 

Creating backup and activating new installation…

 

Done!
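To see which components are currently installed and whether newer versions are available, you can also run the following (the output varies with the SDK release):

gcloud components list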

References:

$HOME/.bash_profile file example.

Google Compute Engine documentation.

Man pages: ssh(1), bash(1)

 

 

How To: Update gcutil / gcloud Components On a Linux / Unix / OS X

How can I update the Google Cloud SDK components on a Linux or OS X (Unix-based) system?

 

The easiest and recommended way is to use the gcloud command to update the various Google Cloud SDK libraries, commands, and other components. Open the Terminal and run the update command when you see the following message on screen:

There are available updates for some Cloud SDK components

The command to update gcloud components is as follows:

gcloud components update

Sample outputs:

The following components will be removed:

———————————————————————-

| Big Query Command Line Tool                  |     2.0.17 | 1.3 MB |

| Cloud SDK Core Libraries                     | 2013.12.06 | < 1 MB |

| Cloud SDK Core Libraries (Platform Specific) | 2013.11.19 | < 1 MB |

| Cloud Storage Command Line Tool              |       3.38 | 1.7 MB |

| Compute Engine Command Line Tool             |     1.12.0 | < 1 MB |

———————————————————————-

The following components will be installed:

——————————————————————————

| Big Query Command Line Tool                          |     2.0.17 | < 1 MB |

| Big Query Command Line Tool (Platform Specific)      |     2.0.17 | < 1 MB |

| Cloud SDK Core Libraries                             | 2014.01.27 | < 1 MB |

| Cloud SDK Core Libraries (Platform Specific)         | 2014.01.27 | < 1 MB |

| Cloud SQL Admin Command Line Interface               | 2014.01.28 | < 1 MB |

| Cloud Storage Command Line Tool                      |       3.42 | 1.8 MB |

| Cloud Storage Command Line Tool (Platform Specific)  |       3.42 | < 1 MB |

| Compute Engine Command Line Tool                     |     1.13.0 | < 1 MB |

| Compute Engine Command Line Tool (Platform Specific) |     1.13.0 | < 1 MB |

——————————————————————————

 

Do you want to continue (Y/n)?  y

 

Creating update staging area…

 

Uninstalling: Big Query Command Line Tool … Done

Uninstalling: Cloud SDK Core Libraries … Done

Uninstalling: Cloud SDK Core Libraries (Platform Specific) … Done

Uninstalling: Cloud Storage Command Line Tool … Done

Uninstalling: Compute Engine Command Line Tool … Done

 

Installing: Big Query Command Line Tool … Done

Installing: Big Query Command Line Tool (Platform Specific) … Done

Installing: Cloud SDK Core Libraries … Done

Installing: Cloud SDK Core Libraries (Platform Specific) … Done

Installing: Cloud SQL Admin Command Line Interface … Done

Installing: Cloud Storage Command Line Tool … Done

Installing: Cloud Storage Command Line Tool (Platform Specific) … Done

Installing: Compute Engine Command Line Tool … Done

Installing: Compute Engine Command Line Tool (Platform Specific) … Done

 

Creating backup and activating new installation…

 

Done!

Please note that when you update the SDK, you may end up losing the credentials for the active account. To re-activate an existing user account, use the following command:

$ gcloud auth login

 

Just follow the on-screen instructions. Use the following command to ssh into your Google cloud host:

$ gcutil ssh your-compute-name-here

$ gcutil ssh db2
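To confirm that the update took effect, print the SDK and component versions (a quick check; the exact list depends on what you have installed):

$ gcloud version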

 

 

How To Install Google Chrome 62 On a RHEL/CentOS 7 and Fedora Linux 26 Using Yum Command

How do I install the latest version of Google Chrome (version 62 at the time of writing) on a Red Hat Enterprise Linux or CentOS Linux version 7.x or Fedora Linux v22/23/24/25/26 system using the yum command line option?

 

Google Chrome is a browser that combines a minimal design with sophisticated technology to make the web faster, safer, and easier. You can install it on any Linux distro including CentOS, RHEL, and Fedora Linux.

Find out if your Linux distro is 32-bit or 64-bit

Type the following command to find out whether the Linux kernel and distro are running in 32-bit or 64-bit mode:

echo "You are using $(getconf LONG_BIT) bit Linux distro."

Outputs:

You are using 64 bit Linux distro.

OR try:

$ uname -m

x86_64

Procedure to install Google Chrome 62 on a RHEL/CentOS/Fedora Linux:

Here is how to install and use Google Chrome 62 in a few easy steps:

Open the Terminal application and grab the 64-bit Google Chrome package.

Type the following command to download the 64-bit version of Google Chrome:

wget https://dl.google.com/linux/direct/google-chrome-stable_current_x86_64.rpm

To install Google Chrome and its dependencies on CentOS/RHEL, type:

sudo yum install ./google-chrome-stable_current_*.rpm

Start Google Chrome from the CLI:

google-chrome &

Sample outputs from yum command:

Fig.01 Installing Chrome Web Browser Using Yum Command

 

Sample session:

Fig.02: About Google Chrome Version Number

 

Google Chrome 62 running on my Fedora Linux desktop:

Google Chrome 62 in action on a Fedora Linux 26

Please note that these instructions always install the latest version of Google Chrome on a CentOS/RHEL/Fedora Linux system.
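You can confirm which version ended up installed from the command line (the version string will change over time as the repo updates):

google-chrome --version

Or query the installed RPM:

rpm -q google-chrome-stable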

A note about Fedora Linux v24.x/25.x/26.x users

Type the following dnf command:

$ sudo dnf install google-chrome-stable_current_*.rpm

 

Sample outputs:

Fig.03: Installing Google Chrome on a Fedora using dnf command

A note about Google chrome repo file

The above procedure creates the /etc/yum.repos.d/google-chrome.repo file as follows. This is useful for automatically updating your Google Chrome version:

$ cat /etc/yum.repos.d/google-chrome.repo

 

Sample outputs:

[google-chrome]

name=google-chrome

baseurl=http://dl.google.com/linux/chrome/rpm/stable/x86_64

enabled=1

gpgcheck=1

gpgkey=https://dl.google.com/linux/linux_signing_key.pub
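With this repo file in place, you can ask yum whether a newer Chrome build is available without installing anything (a read-only check):

$ yum check-update google-chrome-stable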

How do I upgrade Google Chrome from an older version?

You can simply update it by typing the following command:

$ sudo yum update google-chrome-stable

 

OR use the following dnf command to update it on a Fedora Linux:

$ sudo dnf update google-chrome-stable

See also

Google Chrome Download Page.

yum command

 

 

Google Compute Engine scp Files on a Linux or Unix or Mac OS X

The scp command copies files between hosts on a network using ssh for data transfer. How do I use scp to facilitate remote file transfers with Google Compute Engine virtual machines on a Linux, OS X, or Unix-like system?

 

You need to use the gcloud compute command to copy files between a Google virtual machine instance and your local machine running OS X, Linux, or another Unix-like system.

Syntax

The basic syntax to copy a file from your local system to a remote Google VM is as follows:

gcloud compute copy-files local-file-name {instance-name-here}:/path/ --zone {zone-name-here}

The basic syntax to copy a file from a remote Google VM to your local system is as follows:

gcloud compute copy-files {instance-name-here}:/remote/path /local/dir/ --zone {zone-name-here}

gcloud compute copy-files {instance-name-here}:~/filename /local/dir/ --zone {zone-name-here}

Examples

To list your Google compute VMs, enter:

$ gcloud compute instances list

 

Sample outputs:

Fig.01: Note down the Google instance NAME and ZONE

 

To copy the local /etc/hosts file to the remote VM called instance-1 hosted in the us-central1-b zone, enter:

$ gcloud compute copy-files /etc/hosts vivek@instance-1:~/ --zone us-central1-b

 

To copy the local ~/webapp/ directory to the remote VM called instance-1 hosted in the us-central1-b zone, enter:

$ gcloud compute copy-files ~/webapp/ nginx@instance-1:/var/www/nginx/ --zone us-central1-b

 

To copy the remote /foo/ directory from the remote VM called instance-1 hosted in the us-central1-b zone, enter:

$ gcloud compute copy-files USER@instance-1:/foo/ ~/backups/ --zone us-central1-b

 

To copy the remote file ~/db.conf.py from the remote VM called instance-1 hosted in the us-central1-b zone, enter:

$ gcloud compute copy-files USER@instance-1:~/db.conf.py $HOME/data/ --zone us-central1-b

Options

You can pass the following options:

--dry-run : If provided, prints the command that would be run to standard out instead of executing it (see the example after this list).

--plain : Suppresses the automatic addition of ssh(1)/scp(1) flags. This flag is useful if you want to take care of authentication yourself or re-enable strict host checking.

--ssh-key-file SSH_KEY_FILE : The path to the SSH key file. By default, this is ~/.ssh/google_compute_engine.

--zone ZONE : The zone of the instance to copy files to/from. If omitted, you will be prompted to select a zone.
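For example, to preview the exact scp invocation that would be run for the first copy example above, without actually transferring anything, combine it with the --dry-run flag described above:

$ gcloud compute copy-files /etc/hosts vivek@instance-1:~/ --zone us-central1-b --dry-run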

 

 

How to install FreeBSD 11 on Google Cloud Compute

How can I deploy or install the FreeBSD version 11.x Unix operating system on Google Compute Engine? Do I need to create my own FreeBSD disk image to get started with Google Cloud Compute?

 

It is true that Google Compute Engine supports Debian, Ubuntu, RHEL, SUSE, and FreeBSD Unix. However, the create-an-instance page only displays a handful of operating systems, as shown below:

Fig.01: On VM creation page only Linux and Windows are available

You can easily create a FreeBSD-based VM using the following procedure.

Install gcloud SDK on Linux

First, make sure that Python 2.7 is installed on your Linux based desktop system:

$ wget https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-168.0.0-linux-x86_64.tar.gz

$ tar zxvf google-cloud-sdk-168.0.0-linux-x86_64.tar.gz

$ ./google-cloud-sdk/install.sh

 

Initialize the gcloud SDK:

$ gcloud init

 

In your browser, log in to your Google user account when prompted and click Allow to grant permission to access Google Cloud Platform resources. Verify that it is working:

$ gcloud auth list

Install gcloud SDK on Mac OS X

Type the following commands:

$ wget https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-168.0.0-darwin-x86_64.tar.gz

$ tar -zxvf google-cloud-sdk-168.0.0-darwin-x86_64.tar.gz

$ ./google-cloud-sdk/install.sh

$ gcloud init

$ gcloud auth list

Get a list of gcloud compute images

Type the following command:

$ gcloud compute images list --project freebsd-org-cloud-dev --no-standard-images

 

OR

$ gcloud compute images list \
--project freebsd-org-cloud-dev \
--no-standard-images | grep -i freebsd-11

 

Sample outputs:

freebsd-11-0-beta4-amd64                  freebsd-org-cloud-dev                      READY

freebsd-11-0-current-amd64-2015-07-23     freebsd-org-cloud-dev                      READY

freebsd-11-0-current-amd64-2015-08-04     freebsd-org-cloud-dev                      READY

freebsd-11-0-current-amd64-2015-08-19     freebsd-org-cloud-dev                      READY

freebsd-11-0-current-amd64-2015-08-27     freebsd-org-cloud-dev                      READY

freebsd-11-0-current-amd64-2015-09-04     freebsd-org-cloud-dev                      READY

freebsd-11-0-current-amd64-2015-09-18     freebsd-org-cloud-dev                      READY

freebsd-11-0-current-amd64-2015-10-02     freebsd-org-cloud-dev                      READY

….

freebsd-11-1-beta2-amd64                  freebsd-org-cloud-dev                      READY

freebsd-11-1-beta3-amd64                  freebsd-org-cloud-dev                      READY

freebsd-11-1-prerelease-amd64-2017-05-19  freebsd-org-cloud-dev                      READY

freebsd-11-1-prerelease-amd64-2017-05-26  freebsd-org-cloud-dev                      READY

freebsd-11-1-prerelease-amd64-2017-06-02  freebsd-org-cloud-dev                      READY

freebsd-11-1-rc1-amd64                    freebsd-org-cloud-dev                      READY

freebsd-11-1-rc2-amd64                    freebsd-org-cloud-dev                      READY

freebsd-11-1-rc3-amd64                    freebsd-org-cloud-dev                      READY

freebsd-11-1-release-amd64                freebsd-org-cloud-dev                      READY

freebsd-11-1-stable-amd64-2017-07-28      freebsd-org-cloud-dev                      READY

freebsd-11-1-stable-amd64-2017-08-08      freebsd-org-cloud-dev                      READY

freebsd-11-1-stable-amd64-2017-08-15      freebsd-org-cloud-dev                      READY

freebsd-11-1-stable-amd64-2017-08-23      freebsd-org-cloud-dev                      READY

freebsd-11-1-stable-amd64-2017-08-29      freebsd-org-cloud-dev                      READY

How to deploy FreeBSD 11 on Google Cloud

The syntax is:

$ gcloud compute instances create {INSTANCE} --image freebsd-11-1-release-amd64 \
--image-project=freebsd-org-cloud-dev

 

You need a minimum of 22GB of disk space. So here is my command to create a VM in the us-central1-c zone, with the n1-standard-1 machine type and the boot disk set to 60GB of SSD storage:

$ gcloud compute instances create "nixcraft-freebsd11" \
--zone "us-central1-c" \
--machine-type "n1-standard-1" \
--network "default" --maintenance-policy "MIGRATE" \
--image "freebsd-11-1-release-amd64" --image-project=freebsd-org-cloud-dev \
--boot-disk-size "60" \
--boot-disk-type "pd-ssd"

 

Sample outputs:

Fig.02: FreeBSD 11 on Google Compute Engine Deployed
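Once the VM is created, you can review its full configuration, including the external IP address, with the describe sub-command (read-only; adjust the name and zone to match your own instance):

$ gcloud compute instances describe nixcraft-freebsd11 --zone us-central1-c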

How do I ssh into the FreeBSD 11 Google Cloud Compute server?

Simply type the following command:

$ gcloud compute ssh {INSTANCE}

$ gcloud compute ssh {INSTANCE} --zone "us-central1-c"

$ gcloud compute ssh nixcraft-freebsd11 \
--zone "us-central1-c" \
--project "mybsdvms-nixcraft-156800"

$ ssh -i ~/.ssh/my-gcs user@public-IP

 

Sample session:

Fig.03: SSH into my Google cloud server powered by FreeBSD 11

How do I list my Google compute VMs?

Type the following command from your Unix/Linux desktop:

$ gcloud compute instances list

How do I login as root user?

Simply type the command:

$ sudo -s

#

How do I install a bash shell?

Simply type the following pkg command to install the bash shell on FreeBSD 11:

# pkg install bash

 

Sample outputs:

Fig.04: Howto Install BASH in FreeBSD using pkg Command
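If you want bash as your login shell rather than the default csh/sh, you can change it with chsh (the user name here is just an example; use your own account):

# chsh -s /usr/local/bin/bash vivek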

How do I upgrade my FreeBSD 11.x system hosted on Google Compute Engine?

Run the following commands to update the base system:

# freebsd-update fetch

# freebsd-update install

## [optional but needed for the FreeBSD kernel and other stuff] ##

# reboot
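After the reboot, you can confirm that the kernel and userland are at the same patch level (a quick sanity check):

# freebsd-version -ku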

 

Run the following command to update installed packages to the latest version:

# pkg update

# pkg upgrade

 

Sample outputs:

Updating FreeBSD repository catalogue…

FreeBSD repository is up to date.

All repositories are up to date.

Checking for upgrades (5 candidates): 100%

Processing candidates (5 candidates): 100%

The following 4 package(s) will be affected (of 0 checked):

 

Installed packages to be UPGRADED:

sudo: 1.8.20p2_2 -> 1.8.20p2_3

python27: 2.7.13_6 -> 2.7.13_7

curl: 7.54.1 -> 7.55.1

ca_root_nss: 3.31 -> 3.32

 

Number of packages to be upgraded: 4

 

13 MiB to be downloaded.

 

Proceed with this action? [y/N]: y

[1/4] Fetching sudo-1.8.20p2_3.txz: 100%  891 KiB 912.5kB/s    00:01

[2/4] Fetching python27-2.7.13_7.txz: 100%   10 MiB  10.9MB/s    00:01

 

 

How to create a new config file in Ansible playbook

I wanted to create a file named /etc/apt/apt.conf.d/1000-force-ipv4-transport with the value set to 'Acquire::ForceIPv4 true;' on 20 cloud servers hosted in AWS. I have already set up an Ansible playbook to automate things. How do I create a new file using an Ansible playbook? Is it possible to create a complex file with many lines of text (say squid.conf) using the Ansible IT automation tool?

 

You can use any one of the following modules to create a new file:

copy module – Copies files to remote locations.

template module – Templates a file out to a remote server.

How to write single-line content on a remote server and create a file

The syntax is pretty simple:

- copy:
    content: 'your config line here'
    dest: /path/to/file

As per the documentation:

When used instead of ‘src’, sets the contents of a file directly to the specified value. This is for simple values, for anything complex or with formatting please switch to the template module.

So here is a sample create-file.yml file:

# create a file to force apt-get to use IPv4 only
- copy:
    content: 'Acquire::ForceIPv4 true;'
    dest: /etc/apt/apt.conf.d/1000-force-ipv4-transport

Where,

content: 'Acquire::ForceIPv4 true;' – Sets the contents of the file directly to the specified string

dest: /etc/apt/apt.conf.d/1000-force-ipv4-transport – Set the remote absolute path where the file should be copied/created

backup: yes – Create a backup file (including timestamp information) so you can get the original back if you clobber it by mistake

owner: root – Set the user that should own the file/directory

group: root – Set the group that should own the file/directory

mode: 0600 – Set the file permission using octal numbers

So here is a sample inventory file:

$ cat ~/hosts

[kvmhost]

192.168.2.45 ansible_user=root

192.168.2.46 ansible_user=root
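Before running any playbook, it can help to confirm that Ansible can actually reach the hosts in this inventory with an ad-hoc ping (a quick connectivity check; nothing is changed on the servers):

$ ansible -i ~/hosts kvmhost -m ping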

 

Here is my updated create-file.yml for you:


# sample config file
- hosts: kvmhost
  remote_user: root
  tasks:
    # create a new file on each host
    - copy:
        content: 'Acquire::ForceIPv4 true;'
        dest: /etc/apt/apt.conf.d/1000-force-ipv4-transport
        backup: yes
        owner: root
        group: root
        mode: 0644
    # run apt-get using IPv4 as well
    - apt:
        update_cache: yes
        cache_valid_time: 3600
        upgrade: dist

You can run it as usual:

$ ANSIBLE_HOSTS=~/hosts ansible-playbook create-file.yml

Sample outputs:

Fig.01: Playbook in action
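If you want to validate the playbook before touching the servers, a syntax check and a dry run are useful (both are standard ansible-playbook flags; nothing is modified in check mode):

$ ansible-playbook -i ~/hosts create-file.yml --syntax-check

$ ansible-playbook -i ~/hosts create-file.yml --check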

How to create a blank file in an Ansible playbook

The syntax is pretty simple:

file:
  path: /usr/local/etc/my.conf
  state: touch
  owner: root
  group: root
  mode: 0600

Where,

path: /usr/local/etc/my.conf – Set path to the file being managed

state: touch – Create a file set by path

owner: root – Set the user that should own the file/directory

group: root – Set the group that should own the file/directory

mode: 0600 – Set the file permission using octal numbers
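If you just want to try the same parameters quickly without writing a playbook, you can call the file module ad hoc from the shell (a sketch using the inventory file from earlier; adjust the path as needed):

$ ansible -i ~/hosts kvmhost -m file -a "path=/usr/local/etc/my.conf state=touch owner=root group=root mode=0600"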

How to create a complex file using ansible

If you need to add multiple lines, consider using the template module. Templates are processed by the Jinja2 templating language and the resulting file is copied out to the remote server.

Step 1 – Create an Ansible playbook called squid.yml

$ cat squid.yml

 

Sample outputs:

# Create squid.conf
- hosts: kvmhost
  remote_user: root
  tasks:
    - template:
        src: squid.conf.j2
        dest: /etc/squid/squid.conf
        owner: root
        group: root
        mode: 0644
        validate: '/usr/sbin/squid -k parse -f %s'
        backup: yes
    - service:
        name: squid
        state: restarted

Step 2 – Create a Jinja2 template called squid.conf.j2

$ cat squid.conf.j2

Sample outputs:

acl mylan src {{ nixcraft_vlan_lan_subnet }}

acl SSL_ports port 443

acl CONNECT method CONNECT

http_access deny !Safe_ports

http_access deny CONNECT !SSL_ports

http_access allow localhost manager

http_access deny manager

http_access allow localhost

http_access allow mylan

http_access deny all

http_port {{ nixcraft_http_port }}

tcp_outgoing_address {{ nixcraft_cloud_server_ip }}

cache_mem {{ nixcraft_memory }} MB

cache_dir diskd /var/spool/squid 1024 16 256 Q1=72 Q2=64

access_log daemon:/var/log/squid/access.log squid

coredump_dir /var/spool/squid

refresh_pattern ^ftp:  1440 20% 10080

refresh_pattern ^gopher: 1440 0% 1440

refresh_pattern -i (/cgi-bin/|\?) 0 0% 0

refresh_pattern (Release|Packages(.gz)*)$      0       20%     2880

refresh_pattern .  0 20% 4320

forwarded_for delete

via off

forwarded_for off

follow_x_forwarded_for deny all

request_header_access X-Forwarded-For deny all

Step 3 – Update the inventory file called ~/hosts

Update your inventory file with variables for your squid.conf.j2 template:

$ cat ~/hosts

## my hosts ##

[kvmhost]

192.168.2.45 ansible_user=root

192.168.2.46 ansible_user=root

## variables with values ##

[kvmhost:vars]

nixcraft_vlan_lan_subnet=10.8.0.0/24

nixcraft_http_port=10.8.0.1:3128

nixcraft_cloud_server_ip=72.xxx.yyy.zzz

nixcraft_memory=1024

Step 4 – Run your playbook to create the complex squid.conf file on remote servers

Type the following command:

$ ANSIBLE_HOSTS=~/hosts ansible-playbook squid.yml
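Before the real run, you can preview exactly what the rendered template will change on each host by combining check mode with --diff (standard ansible-playbook flags; here the inventory is passed with -i, which is equivalent to the ANSIBLE_HOSTS form above, and nothing is applied):

$ ansible-playbook -i ~/hosts squid.yml --check --diff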

For more info, see the copy, template, and file module documentation.

 

 
