Monthly Archives: July 2019

How to configure IPsec/L2TP VPN Clients on Linux

After setting up your own VPN server, follow these steps to configure your devices. If you are unable to connect, first check that the VPN credentials were entered correctly.

Commands must be run as root on your VPN client.

To set up the VPN client, first install the following packages:

# For Ubuntu & Debian
apt-get update
apt-get -y install strongswan xl2tpd

# For RHEL/CentOS
yum -y install epel-release
yum --enablerepo=epel -y install strongswan xl2tpd

# For Fedora
yum -y install strongswan xl2tpd

Create VPN variables (replace with actual values):

VPN_SERVER_IP=your_vpn_server_ip
VPN_IPSEC_PSK=your_ipsec_pre_shared_key
VPN_USER=your_vpn_username
VPN_PASSWORD=your_vpn_password

Configure strongSwan:

cat > /etc/ipsec.conf <<EOF
# ipsec.conf - strongSwan IPsec configuration file

# basic configuration

config setup
  # strictcrlpolicy=yes
  # uniqueids = no

# Add connections here.

# Sample VPN connections

conn %default
  ikelifetime=60m
  keylife=20m
  rekeymargin=3m
  keyingtries=1
  keyexchange=ikev1
  authby=secret
  ike=aes128-sha1-modp2048!
  esp=aes128-sha1-modp2048!

conn myvpn
  keyexchange=ikev1
  left=%defaultroute
  auto=add
  authby=secret
  type=transport
  leftprotoport=17/1701
  rightprotoport=17/1701
  right=$VPN_SERVER_IP
EOF

cat > /etc/ipsec.secrets <<EOF
: PSK "$VPN_IPSEC_PSK"
EOF

chmod 600 /etc/ipsec.secrets

# For CentOS/RHEL & Fedora ONLY
mv /etc/strongswan/ipsec.conf /etc/strongswan/ipsec.conf.old 2>/dev/null
mv /etc/strongswan/ipsec.secrets /etc/strongswan/ipsec.secrets.old 2>/dev/null
ln -s /etc/ipsec.conf /etc/strongswan/ipsec.conf
ln -s /etc/ipsec.secrets /etc/strongswan/ipsec.secrets

Configure xl2tpd:

cat > /etc/xl2tpd/xl2tpd.conf <<EOF
[lac myvpn]
lns = $VPN_SERVER_IP
ppp debug = yes
pppoptfile = /etc/ppp/options.l2tpd.client
length bit = yes
EOF

cat > /etc/ppp/options.l2tpd.client <<EOF
ipcp-accept-local
ipcp-accept-remote
refuse-eap
require-chap
noccp
noauth
mtu 1280
mru 1280
noipdefault
defaultroute
usepeerdns
connect-delay 5000
name $VPN_USER
password $VPN_PASSWORD
EOF

chmod 600 /etc/ppp/options.l2tpd.client

The VPN client setup is now complete. Follow the steps below to connect.

Note: You must repeat all steps below every time you try to connect to the VPN.

Create xl2tpd control file:

mkdir -p /var/run/xl2tpd
touch /var/run/xl2tpd/l2tp-control

Restart services:

service strongswan restart
service xl2tpd restart

Start the IPsec connection:

# Ubuntu & Debian
ipsec up myvpn

# CentOS/RHEL & Fedora
strongswan up myvpn

Start the L2TP connection:

echo "c myvpn" > /var/run/xl2tpd/l2tp-control

Run ifconfig and check the output. You should now see a new interface ppp0.

Check your existing default route:

ip route

Find this line in the output: default via X.X.X.X .... Write down this gateway IP for use in the two commands below.

Exclude your VPN server’s IP from the new default route (replace with actual value):

route add YOUR_VPN_SERVER_IP gw X.X.X.X

If your VPN client is a remote server, you must also exclude your Local PC’s public IP from the new default route, to prevent your SSH session from being disconnected (replace with actual value):

route add YOUR_LOCAL_PC_PUBLIC_IP gw X.X.X.X

Add a new default route to start routing traffic via the VPN server:

route add default dev ppp0

The VPN connection is now complete. Verify that your traffic is being routed properly:

wget -qO- http://ipv4.icanhazip.com; echo

The above command should return your VPN server's IP address.

To stop routing traffic via the VPN server:

route del default dev ppp0

To disconnect:

# Ubuntu & Debian
echo "d myvpn" > /var/run/xl2tpd/l2tp-control
ipsec down myvpn

# CentOS/RHEL & Fedora
echo "d myvpn" > /var/run/xl2tpd/l2tp-control
strongswan down myvpn

Data Analysis with Pandas & Python

What is Data Analysis?
Data analysis is a process of inspecting, cleansing, transforming, and modeling data with the goal of discovering useful information, informing conclusions, and supporting decision-making. In today's business world, data analysis helps make decisions more scientific and helps businesses operate more effectively.
Python is a great language for doing data analysis, primarily because of the fantastic ecosystem of data-centric Python packages. Pandas is one of those packages, providing fast, flexible, and expressive data structures designed to make working with "relational" or "labeled" data both easy and intuitive. It aims to be the fundamental high-level building block for doing practical, real-world data analysis in Python.
In this article, I use pandas to walk through the basics of doing data analysis.
pandas has three main data structures: Series, DataFrame, and Panel.

Installation
The easiest way to install pandas is to use pip:

pip install pandas

Or download it from here.

  • pandas Series

A pandas Series is a one-dimensional labeled array.

import pandas as pd
index_list = ['test1', 'test2', 'test3', 'test4']
a = pd.Series([100, 98.7, 98.4, 97.7],index=index_list)
print(a)
output:
test1    100.0
test2     98.7
test3     98.4
test4     97.7
dtype: float64

Labels can be accessed using the index attribute:
print(a.index)

Index(['test1', 'test2', 'test3', 'test4'], dtype='object')

You can use array indexing or labels to access data in the series.
print(a[1])
print(a['test4'])

98.7
97.7

You can also apply mathematical operations on pandas series.
b = a * 2
c = a ** 1.5
print(b)
print(c)

test1 200.0
test2 197.4
test3 196.8
test4 195.4
dtype: float64

test1 1000.000000
test2 980.563513
test3 976.096258
test4 965.699142
dtype: float64

You can even create a series of heterogeneous data.
s = pd.Series(['test1', 1.2, 3, 'test2'], index=['test3', 'test4', 2, '4.3'])

print(s)

test3   test1
test4   1.2
2       3
4.3     test2
dtype: object
  • pandas DataFrame

pandas DataFrame is a two-dimensional array with heterogeneous data, i.e., data aligned in a tabular fashion in rows and columns.
Structure
Let us assume that we are creating a DataFrame with a sales team's data.

Name   Age  Gender  Rating
Steve  32   Male    3.45
Lia    28   Female  4.6
Vin    45   Male    3.9
Katie  38   Female  2

You can think of it as an SQL table or a spreadsheet data representation.
The table represents the data of a sales team of an organization with their overall performance rating. The data is represented in rows and columns. Each column represents an attribute and each row represents a person.
The data types of the four columns are as follows −

Column Type
Name String
Age Integer
Gender String
Rating Float

Key Points
• Heterogeneous data
• Size Mutable
• Data Mutable

A pandas DataFrame can be created using the following constructor −
pandas.DataFrame( data, index, columns, dtype, copy)

•  data
data takes various forms like ndarray, Series, map, lists, dict, constants, and also another DataFrame.
•  index
For the row labels, the index to be used for the resulting frame. Optional; defaults to np.arange(n) if no index is passed.
•  columns
For the column labels. Optional; defaults to np.arange(n) if no column labels are passed.
•  dtype
The data type of each column.
•  copy
Whether to copy the data; the default is False.
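
For example, these parameters can be passed to the constructor directly. A small illustrative snippet (values are made up for demonstration):

import numpy as np
import pandas as pd

data = np.array([[100.0, 100.0], [98.7, 100.0]])
df = pd.DataFrame(data, index=['test1', 'test2'],
                  columns=['column1', 'column2'], dtype=float, copy=True)
print(df)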

There are many methods to create DataFrames.
• Lists
• dict
• Series
• Numpy ndarrays
• Another DataFrame
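
For instance, the sales-team table shown earlier can be built from a plain dict of lists (an illustrative sketch, not from the original post):

import pandas as pd

data = {
    'Name': ['Steve', 'Lia', 'Vin', 'Katie'],
    'Age': [32, 28, 45, 38],
    'Gender': ['Male', 'Female', 'Male', 'Female'],
    'Rating': [3.45, 4.6, 3.9, 2.0],
}
df = pd.DataFrame(data)
print(df.dtypes)  # Name/Gender: object, Age: int64, Rating: float64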

Creating DataFrame from the dictionary of Series
The following method can be used to create DataFrames from a dictionary of pandas series.

import pandas as pd
index_list = ['test1', 'test2', 'test3', 'test4']
a = {"column1": pd.Series([100, 98.7, 98.4, 97.7],index=index_list), "column2": pd.Series([100, 100, 100, 85.4], index=index_list)}
df = pd.DataFrame(a)

print(df)

      column1  column2
test1 100.0    100.0
test2 98.7     100.0
test3 98.4     100.0
test4 97.7     85.4

print(df.index)

Index(['test1', 'test2', 'test3', 'test4'], dtype='object')

print(df.columns)

Index(['column1', 'column2'], dtype='object')

Creating DataFrame from list of dictionaries
l = [{'orange': 32, 'apple': 42}, {'banana': 25, 'carrot': 44, 'apple': 34}]
df = pd.DataFrame(l, index=['test1', 'test2'])

print(df)

        apple  banana  carrot  orange
test1     42     NaN     NaN    32.0
test2     34    25.0    44.0     NaN

You might have noticed that we got a DataFrame with NaN values in it. This is because we didn't provide data for those particular rows and columns.

Creating DataFrame from Text/CSV files
Pandas comes in handy when you want to load data from a CSV or a text file. It has built-in functions to do this for us.

df = pd.read_csv('happiness.csv')

Yes, we created a DataFrame from a CSV file. This dataset contains the outcome of the European quality of life survey and is available here. Now that we have stored the DataFrame in df, let's see what's inside. First, we will check the size of the DataFrame.

print(df.shape)

(105, 4)

It has 105 rows and 4 columns. Instead of printing out all the data, we will look at the first 10 rows.
df.head(10)

   Country  Gender  Mean    N=
0      AT    Male   7.3   471
1     NaN  Female   7.3   570
2     NaN    Both   7.3  1041
3      BE    Male   7.8   468
4     NaN  Female   7.8   542
5     NaN    Both   7.8  1010
6      BG    Male   5.8   416
7     NaN  Female   5.8   555
8     NaN    Both   5.8   971
9      CY    Male   7.8   433

There are many more ways to create a DataFrame, but for now let's look at basic operations on DataFrames.

Operations on DataFrame
We’ll recall the DataFrame we made earlier.

import pandas as pd
index_list = ['test1', 'test2', 'test3', 'test4']
a = {"column1": pd.Series([100, 98.7, 98.4, 97.7],index=index_list), "column2": pd.Series([100, 100, 100, 85.4], index=index_list)}
df = pd.DataFrame(a)

print(df)

      column1 column2
test1 100.0   100.0
test2 98.7    100.0
test3 98.4    100.0
test4 97.7    85.4

Now we want to create a new column from the current columns. Let's see how it is done.
df['column3'] = (2 * df['column1'] + 3 * df['column2'])/5

        column1  column2  column3
test1    100.0    100.0   100.00
test2     98.7    100.0    99.48
test3     98.4    100.0    99.36
test4     97.7     85.4    90.32

We have created a new column column3 from column1 and column2. We'll create one more using a boolean condition.
df['flag'] = df['column1'] > 99.5

We can also remove columns.
column3 = df.pop('column3')

print(column3)

test1    100.00
test2     99.48
test3     99.36
test4     90.32
Name: column3, dtype: float64

print(df)

       column1  column2   flag
test1    100.0    100.0   True
test2     98.7    100.0  False
test3     98.4    100.0  False
test4     97.7     85.4  False

Descriptive Statistics using pandas
It's very easy to view descriptive statistics of a dataset using pandas. We are going to use biomass data collected from this source. Let's load the data first.

url = 'https://raw.github.com/vincentarelbundock/Rdatasets/master/csv/DAAG/biomass.csv'
df = pd.read_csv(url)
df.head()

   Unnamed: 0  dbh     wood  bark    root  rootsk  branch       species  fac26
0           1   90   5528.0   NaN   460.0     NaN     NaN   E. maculata      z
1           2  106  13650.0   NaN  1500.0   665.0     NaN  E. Pilularis      2
2           3  112  11200.0   NaN  1100.0   680.0     NaN  E. Pilularis      2
3           4   34   1000.0   NaN   430.0    40.0     NaN  E. Pilularis      2
4           5  130      NaN   NaN  3000.0  1030.0     NaN   E. maculata      z

We are not interested in the unnamed column. So, let’s delete that first. Then we’ll see the statistics with one line of code.
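
Assuming the extra column is labelled 'Unnamed: 0' (as in the df.head() output above), this step would look something like:

df = df.drop(columns=['Unnamed: 0'])  # drop the unnamed index column
df.describe()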

          dbh        wood      bark        root        rootsk        branch
count 153.000000 133.000000   17.000000   54.000000   53.000000   76.000000
mean  26.352941  1569.045113  513.235294  334.383333  113.802264  54.065789
std   28.273679  4071.380720  632.467542  654.641245  247.224118  65.606369
min   3.000000   3.000000     7.000000    0.300000    0.050000    4.000000
25%   8.000000   29.000000    59.000000   11.500000   2.000000    10.750000
50%   15.000000  162.000000   328.000000  41.000000   11.000000   35.000000
75%   36.000000  1000.000000  667.000000  235.000000  45.000000   77.750000
max   145.000000 25116.000000 1808.000000 3000.000000 1030.000000 371.000000

It's as simple as that. We can see all the statistics: count, mean, standard deviation, and more. Now let's look at computing some of these metrics individually.

Mean :
print(df.mean())

dbh         26.352941
wood      1569.045113
bark       513.235294
root       334.383333
rootsk     113.802264
branch      54.065789
dtype: float64

Min and Max
print(df.min())

dbh                      3
wood                     3
bark                     7
root                   0.3
rootsk                0.05
branch                   4
species    Acacia mabellae
dtype: object

print(df.max())

dbh          145
wood       25116
bark        1808
root         3000
rootsk      1030
branch      371
species    Other
dtype: object

Pairwise Correlation
df.corr()

             dbh       wood      bark      root    rootsk    branch
dbh     1.000000   0.905175  0.965413  0.899301  0.934982  0.861660
wood    0.905175   1.000000  0.971700  0.988752  0.967082  0.821731
bark    0.965413   0.971700  1.000000  0.961038  0.971341  0.943383
root    0.899301   0.988752  0.961038  1.000000  0.936935  0.679760
rootsk  0.934982   0.967082  0.971341  0.936935  1.000000  0.621550
branch  0.861660   0.821731  0.943383  0.679760  0.621550  1.000000

Data Cleaning
We need to clean our data. Our data might contain missing values, NaN values, outliers, etc. We may need to remove or replace that data; otherwise, our data might not make any sense.
We can find null values using the following method.

print(df.isnull().any())

dbh        False
wood        True
bark        True
root        True
rootsk      True
branch      True
species    False
fac26       True
dtype: bool

We have to remove these null values. This can be done by the method shown below.

newdf = df.dropna()

print(newdf)

     dbh   wood   bark  root  rootsk   branch        species  fac26
123   27  550.0  105.0  44.0     9.0    59.0   B. myrtifolia     z
124   26  414.0   78.0  38.0    13.0    44.0   B. myrtifolia     z
125    9   42.0    8.0   5.0     1.3     7.0   B. myrtifolia     z
126   12   85.0   13.0  17.0     2.2    16.0   B. myrtifolia     z

print(newdf.shape)

(4, 8)

Pandas .Panel()
A panel is a 3D container of data. The term Panel data is derived from econometrics and is partially responsible for the name pandas − pan(el)-da(ta)-s.
The names for the 3 axes are intended to give some semantic meaning to describing operations involving panel data. They are −
• items − axis 0, each item corresponds to a DataFrame contained inside.
• major_axis − axis 1, it is the index (rows) of each of the DataFrames.
• minor_axis − axis 2, it is the columns of each of the DataFrames.

A Panel can be created using the following constructor −
pandas.Panel(data, items, major_axis, minor_axis, dtype, copy)

The parameters of the constructor are as follows −
• data – Data takes various forms like ndarray, series, map, lists, dict, constants and also another DataFrame
• items – axis=0
• major_axis – axis=1
• minor_axis – axis=2
• dtype – the Data type of each column
• copy – Copy data. Default, false

A Panel can be created using multiple ways like −
• From ndarrays
• From dict of DataFrames
• From 3D ndarray

# creating a panel from a 3D ndarray
import pandas as pd
import numpy as np
data = np.random.rand(2,4,5)
p = pd.Panel(data)

print(p)

output:
Dimensions: 2 (items) x 4 (major_axis) x 5 (minor_axis)
Items axis: 0 to 1
Major_axis axis: 0 to 3
Minor_axis axis: 0 to 4

Note − observe the dimensions of the panels created in this section; the axes of each panel reflect the shape of the data used to create it.

From dict of DataFrame Objects

# creating a panel from a dict of DataFrame objects
import pandas as pd
import numpy as np
data = {'Item1' : pd.DataFrame(np.random.randn(4, 3)),
'Item2' : pd.DataFrame(np.random.randn(4, 2))}
p = pd.Panel(data)

print(p)

output:
Dimensions: 2 (items) x 4 (major_axis) x 3 (minor_axis)
Items axis: Item1 to Item2
Major_axis axis: 0 to 3
Minor_axis axis: 0 to 2

Selecting the Data from Panel
Select the data from the panel using −
• Items
• Major_axis
• Minor_axis

Using Items

# creating a panel from a dict of DataFrame objects
import pandas as pd
import numpy as np
data = {'Item1' : pd.DataFrame(np.random.randn(4, 3)),
'Item2' : pd.DataFrame(np.random.randn(4, 2))}
p = pd.Panel(data)

print(p['Item1'])

output:
        0          1          2
0 -0.006795 -1.156193 -0.524367
1 0.025610 1.533741 0.331956
2 1.067671 1.309666 1.304710
3 0.615196 1.348469 -0.410289

We have two items, and we retrieved item1. The result is a DataFrame with 4 rows and 3 columns, which are the Major_axis and Minor_axis dimensions.

Using major_axis
Data can be accessed using the method panel.major_xs(index).
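
Reusing the panel p from the example above, the call that produces the output below would look something like:

print(p.major_xs(1))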

     Item1     Item2
0 0.027133 -1.078773
1 0.115686 -0.253315
2 -0.473201 NaN

Using minor_axis
Data can be accessed using the method panel.minor_xs(index).

import pandas as pd
import numpy as np
data = {'Item1' : pd.DataFrame(np.random.randn(4, 3)),
'Item2' : pd.DataFrame(np.random.randn(4, 2))}
p = pd.Panel(data)

print(p.minor_xs(1))

Item1      Item2
0 0.092727 -1.633860
1 0.333863 -0.568101
2 0.388890 -0.338230
3 -0.618997 -1.01808

 

Fusebill AJAX Transparent Redirect

To facilitate PCI-compliant credit card collection, Fusebill provides an AJAX Transparent Redirect endpoint which you can use to securely capture customers' credit cards. If you are adding the first payment method on a customer, it will be set as the default payment method automatically.

This API action is authenticated with a separate Public API Key. If you do not have that key, please contact Fusebill Support. The Public Key can only be used to authenticate the Transparent Redirect action.

Google reCAPTCHA required.

Fusebill leverages reCAPTCHA technology to ensure payment method data captured is provided by a human and to protect against bots and scripting.

We use Google reCAPTCHA V2 in order to accomplish this.
https://developers.google.com/recaptcha/intro
The basic workflow for how this is accomplished is as follows:

  • Using Fusebill’s public site key, the client is presented with a captcha widget.
  • The user then verifies that they are human, starting with a check box. The user may be presented with additional verification steps such as an image recognition task.
  • The captcha widget then verifies with Google that the user is human, and returns a response token.
  • That response token is then sent to Fusebill with the payment method data for our system to validate and verify.

The reCAPTCHA public site key depends on the Fusebill environment:

Staging (stg-payments.subscriptionplatform.com): 6LcI_GwUAAAAAJZu0VvB68DdxNxb5ZcBIwAX7RVj
Sandbox and Production (payments.subscriptionplatform.com): 6LfVtGwUAAAAALHn9Ycaig9801f6lrPmouzuKF11

Create Credit Card Payment Method

Each field below is listed as: Field name (Type, Required): Details.

CustomerID (Number, Yes): The Fusebill customer ID of the customer you wish to add the card to.
PublicAPIKey (String, Yes): Your public API key. This is found in your Fusebill account under Settings > Integrations > Transparent Redirect.
CardNumber (Number, Yes): The credit card number.
FirstName (String, Yes): The first name of the cardholder.
LastName (String, Yes): The last name of the cardholder.
ExpirationMonth (Number, Yes): Expiration month on the credit card.
ExpirationYear (Number, Yes): Expiration year on the credit card.
CVV (Number, Yes): The credit card verification number.
recaptcha (String, Yes): reCAPTCHA token response.
riskToken (String, No+): WePay Risk token.
clientIp (String, No+): Client/Customer IP address.
email (String, No+): Customer email address.
address1 (String, No*): First line of the payment method address.
address2 (String, No*): Second line of the payment method address.
city (String, No*): City of the payment method.
stateId (Number, No*): State ID of the payment method. These can be found by performing a GET to v1/countries.
countryId (Number, No*): Country ID of the payment method. These can be found by performing a GET to v1/countries.
postalZip (String, No*): PostalZip of the payment method.
paymentCollectOptions (Object, No): Object that allows specifying an amount to collect when creating the card. Only works through JSON, e.g. { "collectionAmount": 1.0 }.

+ Denotes a field required for Fusebill Payments API Risk Fields
* Denotes fields required for AVS and may be required by your account’s Gateway. These fields are also required if using Fusebill Payments accounts as AVS is mandatory.

Note: Address information can optionally be captured as well.

Sample Code

<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
	<head>
		<title>AJAX Transparent Redirect</title>
		<script src="http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
		<script type="text/javascript">
			var verifyCallback = function (response) {
				document.getElementById("mySubmit").disabled = false;
				$('input#recaptcha_token').val(response);
			};
			var expCallback = function () {
				document.getElementById("mySubmit").disabled = true;
				grecaptcha.reset();
			};
			var onloadCallback = function () {
				grecaptcha.render('exampleWithCallback', {
					'sitekey': '',
					'callback': verifyCallback,
					'expired-callback': expCallback
				});
			};

			function AJAXTransparentRedirect() {
				var dataString = 'CustomerId=' + $('input#CustomerId').val() +
						'&PublicApiKey=' + $('input#PublicApiKey').val() +
						'&SuccessUri=' + $('input#SuccessUri').val() +
						'&FailUri=' + $('input#FailUri').val() +
						'&CardNumber=' + $('input#CardNumber').val() +
						'&FirstName=' + $('input#FirstName').val() +
						'&LastName=' + $('input#LastName').val() +
						'&ExpirationMonth=' + $('input#ExpirationMonth').val() +
						'&ExpirationYear=' + $('input#ExpirationYear').val() +
						'&makeDefault=' + $('input#MakeDefault').val() +
						'&Cvv=' + $('input#Cvv').val() +
						'&recaptcha=' + $('input#recaptcha_token').val();
				alert(dataString);

				//Set up the request
				var request = $.ajax({
					type: "POST",
					url: "https://stg-payments.subscriptionplatform.com/api/paymentsv2/",
					data: dataString
				});

				//Set up the callback functions
				request.done(function (msg) {
					//$('#response').append("<p>success</p>");
					alert("success");
					//document.location.replace = '';
				});

				request.fail(function (jqXHR) {
					//$('#response').append("<p>failure</p>");
					alert(parseAndBuildErrorMessage(jqXHR));
					//document.location.replace = '';
					expCallback();
				});
			}

			function htmlEscape(msg) {
				return document.createElement('span')
					.appendChild(document.createTextNode(msg))
					.parentNode
					.innerHTML;
			}

			function buildErrorMessage(errors) {
				if (errors.length == 0) {
					return "";
				}
				var message;
				if (errors.length > 1) {
					message = "<ul>";
					for (var i = 0; i < errors.length; i++) {
						message += "<li>" + htmlEscape(errors[i].Value) + "</li>";
					}
					message += "</ul>";
				} else {
					var internalErrors = errors[0].Value;
					internalErrors = internalErrors.split("\r\n");
					if (internalErrors.length > 1) {
						message = "<ul>";
						for (var i = 0; i < internalErrors.length; i++) {
							message += "<li>" + htmlEscape(internalErrors[i]) + "</li>";
						}
						message += "</ul>";
					} else {
						message = htmlEscape(errors[0].Value);
					}
				}
				return message;
			}

			function parseAndBuildErrorMessage(xhr) {
				if (xhr.status >= 500)
					return "An error occurred, please try again";
				else
					return buildErrorMessage(xhr.responseJSON.Errors);
			}
		</script>
		<script src="https://www.google.com/recaptcha/api.js?onload=onloadCallback&render=explicit" async defer></script>
	</head>
	<body>
		<form id="allfields">
			<fieldset>
				<div>
					<label for="CustomerId">Customer Id</label>
					<input autofocus="autofocus" id="CustomerId" name="CustomerId" type="text" value="your customer id here" />
				</div>
				<div>
					<label for="PublicApiKey">Public Api Key</label>
					<input id="PublicApiKey" name="PublicApiKey" type="text" value="your key here" />
				</div>
				<div>
					<label for="CardNumber">Card number</label>
					<input id="CardNumber" name="CardNumber" type="text" value="4111111111111111" />
				</div>
				<div>
					<label for="FirstName">First name</label>
					<input id="FirstName" name="FirstName" type="text" value="John" />
				</div>
				<div>
					<label for="LastName">Last name</label>
					<input id="LastName" name="LastName" type="text" value="Doe" />
				</div>
				<div>
					<label for="ExpirationMonth">Expiry month</label>
					<input id="ExpirationMonth" name="ExpirationMonth" type="text" value="12" />
				</div>
				<div>
					<label for="ExpirationYear">Expiry year</label>
					<input id="ExpirationYear" name="ExpirationYear" type="text" value="20" />
				</div>
				<div>
					<label for="Cvv">CVV</label>
					<input id="Cvv" name="Cvv" type="text" value="123" />
				</div>
				<input id="MakeDefault" name="MakeDefault" type="hidden" value="true" />
				<div class="g-recaptcha" id="exampleWithCallback"></div>
				<input id="recaptcha_token" name="recaptcha_token" type="hidden" />
			</fieldset>
		</form>
		<input type="button" onclick="AJAXTransparentRedirect();" value="Submit Card" id="mySubmit" disabled />
		<div id="response"></div>
	</body>
</html>

Sample Response

{
    "maskedCardNumber" : "************1111",
    "cardType" : "Visa",
    "expirationMonth" : 10,
    "expirationYear" : 23,
    "customerId" : 50975,
    "firstName" : "a",
    "lastName" : "a",
    "address1" : null,
    "address2" : null,
    "countryId" : null,
    "country" : "",
    "stateId" : null,
    "state" : "",
    "city" : null,
    "postalZip" : null,
    "makeDefault" : true,
    "id" : 5933,
    "uri" : null
}
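
For quick testing against the staging endpoint outside a browser, the same request can be exercised from a script. The sketch below is only illustrative: field names are taken from the sample form above, all values are placeholders, and it assumes the Python requests package is installed. In production the card data and reCAPTCHA token should be posted directly from the customer's browser, as in the sample code.

import requests

payload = {
    "CustomerId": "12345",                    # placeholder Fusebill customer ID
    "PublicApiKey": "your-public-api-key",    # placeholder public key
    "CardNumber": "4111111111111111",
    "FirstName": "John",
    "LastName": "Doe",
    "ExpirationMonth": "12",
    "ExpirationYear": "20",
    "Cvv": "123",
    "MakeDefault": "true",
    "recaptcha": "recaptcha-response-token",  # placeholder reCAPTCHA response token
}

resp = requests.post(
    "https://stg-payments.subscriptionplatform.com/api/paymentsv2/",
    data=payload,  # form-encoded, matching the AJAX sample
)
print(resp.status_code)
print(resp.text)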

Fusebill Payments

When using Fusebill Payments as your gateway processing account, some additional processing and data are required: the clientIp and a risk token.

Additional information is available here.

Fusebill Test Gateways

Available here.

Task Notification Bot for Slack with Django

Slack is a great platform for team collaboration, and not just that: it also has one of the best API interfaces for building chatbots.

In this post, I will walk you through building a minimal Slack bot with a Django backend. The idea is to set up a Slack bot that posts a notification whenever an event is triggered through the backend.

Before we start, let us understand the Slack bot's life cycle:

  • If you are new to Slack, it's a messaging platform focused on team collaboration. Slack lets you create custom applications, including bots (a sort of Messaging Platform as a Service). You run the application backend that processes the business logic on your own server.
  • To start with, you need to be part of a Slack team and have admin privileges to create a new Slack App. If you are not part of a Slack team, you may create one: give the name of your company or team, enter a channel name, and click on "See your channel in Slack".
  • We will create a Slack App for the Slack team, then add a Bot User to the app.
  • We will create a Django-based backend web application to post messages into Slack.
  • After setting up the Slack App, we will have the backend ready to notify events.

Create a Slack App

Start by creating a Slack app here, click Create App. Then proceed with app creation, give it a name and select the Slack team.

Then you will be taken to the App configuration, where you need to do the following to get our bot up and running.

  1. Create a Bot User
  2. Install Slack App to your Team

Create a BOT User

On the left pane click on Bot User then choose a user name for the Bot and set “Always Show My Bot as Online” slider to on. Click on Add Bot User to create our shipment bot.

Install Slack App to Team

Now on the left pane click Install App and install the app to your Slack team.

Once installed, you will get a Bot User OAuth Access Token. Note down this token; we will need it later while configuring the Django app. This token is the passphrase for our bot to interact with the Slack team.

Slack Client Credentials

Also, note down the App Credentials from Basic Information on the left pane. These credentials let us talk to the Slack API: every time we send a message to Slack we send our Client ID (CLIENT_ID) and Client Secret (CLIENT_SECRET) to identify ourselves. Similarly, we can verify that an incoming message was sent by Slack by checking that the Verification Token (VERIFICATION_TOKEN) in the message matches the one in App Credentials.

Now we should have four key values with us.

  1. Client ID — SLACK_CLIENT_ID
  2. Client Secret — SLACK_CLIENT_SECRET
  3. Verification Token — SLACK_VERIFICATION_TOKEN
  4. Bot User Token — SLACK_BOT_USER_TOKEN

Environment Setup

Let us create a new virtual environment “venv” for our project with python version 3.6.x and activate the virtual environment.

virtualenv venv --python=python3.6

You need to activate the virtual environment before installation of other dependencies.

source venv/bin/activate

Now let’s install required packages

pip install django
pip install slacker
pip install slacker-log-handler

Create a Django Application

django-admin startproject shipment_portal
cd shipment_portal
django-admin startapp shipment

Configure Django Settings

We need to add our own application shipment as a dependency. Add the line mentioned below in the file shipment_portal/settings.py:

# shipment_portal/settings.py

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'shipment',  # <== add this line
]

Then add the following configurations in shipment_portal/settings.py with your authentication keys from Slack.

# shipment_portal/settings.py

# SLACK API Configurations
# ----------------------------------------------
# use your keys
SLACK_CLIENT_ID = '20xxxxxxxxxx.20xxxxxxxxxx'
SLACK_CLIENT_SECRET = 'd29fe85a95c9xxxxxxxxxxxxxxxxxxxxx'
SLACK_VERIFICATION_TOKEN = 'xpxxxxxxxxxxxxxxxxxxxxxxxxx'
SLACK_BOT_USER_TOKEN = 'xoxb-xxxxxxxxxx-xxxxxxxxxxxxxxxxxxxxxxxxx'

Now start the Django development server

python manage.py runserver

Once the server is started it will print something similar to this

Performing system checks...
System check identified no issues (0 silenced).
You have 13 unapplied migration(s). Your project may not work properly until you apply the migrations for app(s): admin, auth, contenttypes, sessions.
Run 'python manage.py migrate' to apply them.
July 03, 2017 - 17:30:32
Django version 1.11.3, using settings 'shipment_portal.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.

Ignore the migration warnings and open the URL in your browser.

Create an API endpoint

Now that we have our app server up and running, we need an endpoint that creates a shipment and sends an event message to Slack. We will create an API view with Django as follows:

shipment/views.py

from django.contrib.auth.mixins import LoginRequiredMixin
from django.http import JsonResponse
from django.views import View

from .slackapi import send_notification  # defined in shipment/slackapi.py (shown below)


class ShipmentCreate(LoginRequiredMixin, View):

    def post(self, request, *args, **kwargs):
        try:
            user = request.user.id
            # ===============
            # Your code goes here............
            # ===============
            response = {
                'status': 200,
                'type': '+OK',
                'message': 'Shipment created',
            }
            message = 'Shipment created by ' + str(user)
            send_notification(message, channel='#general')

        except Exception as error:
            response = {
                'status': 500,
                'type': '-ERR',
                'message': 'Internal Server Error',
            }
        return JsonResponse(response, status=response.get('status'))

Configure Django Routes

If you are new to web applications, routing is the way to tell the web server which function to invoke when a URL is hit with a request. When the URL is hit, the corresponding function is invoked and the request is passed to it as a parameter.
Add the following lines in shipment/urls.py to tie the ShipmentCreate API class to http://localhost:8000/shipment/

from django.conf.urls import url

from .views import ShipmentCreate

urlpatterns = [
    url(r'^shipment/$', ShipmentCreate.as_view(), name='shipment'),
]
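
The app's URLs also need to be wired into the project's root URLconf. A minimal sketch, assuming Django 1.11-style routing (matching the runserver output above):

# shipment_portal/urls.py (sketch)
from django.conf.urls import url, include
from django.contrib import admin

urlpatterns = [
    url(r'^admin/', admin.site.urls),
    url(r'^', include('shipment.urls')),
]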

slackapi.py

Functions written in shipment/slackapi.py are used to post notifications/messages to Slack.

import logging
import os

from django.conf import settings
from slacker import Slacker

slack = Slacker(settings.SLACK_BOT_USER_TOKEN)
slack_logger = logging.getLogger(__name__)

# When set, all messages are redirected to this channel (useful while testing).
TEST_CHANNEL = os.environ.get('TEST_SLACK_CHANNEL', None)

# Per-channel presentation defaults (bot username, icon, etc.).
channel_details = {
    "#general": {
        "username": 'shipment-' + settings.ENVIRON,  # ENVIRON is assumed to be defined in settings
        "icon_url": None,
        "icon_emoji": ":email:"
    },
    "@Jeenal": {},
}


def post_to_slack(message, channel, username=None, icon_url=None, icon_emoji=None):
    try:
        channel = TEST_CHANNEL or channel
        channel_info = channel_details.get(channel, dict())
        slack.chat.post_message(
            channel=channel,
            text=message,
            username=username or channel_info.get("username"),
            icon_url=icon_url or ((not icon_emoji) and channel_info.get("icon_url")) or None,
            icon_emoji=icon_emoji or ((not icon_url) and channel_info.get("icon_emoji")) or None,
            as_user=False
        )
    except Exception:
        slack_logger.error('Error posting message to Slack\n', exc_info=True)


def send_notification(message, channel):
    try:
        post_to_slack(message, channel)
    except Exception:
        slack_logger.error('Error sending notification to Slack\n', exc_info=True)

 

 

 

How to specify the source address for all outbound connections

If you have multiple IPs assigned on your Linux PC, there is a chance that you want some applications to use a different IP than the default one. Updating IP routes every time isn't a good idea, and you may mess things up.

Get bindhack.c:

wget 'https://gist.githubusercontent.com/akhilin/f6660a2f93f64545ff8fcc0d6b23e42a/raw/7bf3f066b74a4b9e3d3768a8affee26da6a3ada6/bindhack.c' -P /tmp/

Compile it:

gcc -fPIC -static -shared -o /tmp/bindhack.so /tmp/bindhack.c -lc -ldl

Copy it to the library folder:

cp /tmp/bindhack.so /usr/lib/ && chmod +x /usr/lib/bindhack.so

Optional (skip if you already have a nameserver configured):

echo 'nameserver 8.8.8.8' >> /etc/resolv.conf

Using bindhack:

BIND_ADDR=<source ip> LD_PRELOAD=/usr/lib/bindhack.so <command here>

Example
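
For instance (the IP address below is only a placeholder; substitute one of the addresses assigned to your machine), to check which public IP your traffic leaves from:

BIND_ADDR=192.168.1.100 LD_PRELOAD=/usr/lib/bindhack.so wget -qO- http://ipv4.icanhazip.com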

 

You can add the function below to your .bashrc so you can run it at any time:

 

bindhack() {
  # Author: Akhil Jalagam
  # usage: bindhack <bind ip> "<command with quotes>"
  [ $# -lt 2 ] && echo "missing arguments: $0 [bind ip] [command with quotes]" && return 1
  wget 'https://gist.githubusercontent.com/akhilin/f6660a2f93f64545ff8fcc0d6b23e42a/raw/7bf3f066b74a4b9e3d3768a8affee26da6a3ada6/bindhack.c' -P /tmp/
  gcc -fPIC -static -shared -o /tmp/bindhack.so /tmp/bindhack.c -lc -ldl
  [ -f /usr/lib/bindhack.so ] || { `type -P cp` /tmp/bindhack.so /usr/lib/ && chmod +x /usr/lib/bindhack.so; }
  # temporarily switch to a public resolver, keeping a backup of resolv.conf
  [ -f /etc/resolv.conf ] && `type -P cp` /etc/resolv.conf /etc/resolv.conf.bak && echo 'nameserver 8.8.8.8' >> /etc/resolv.conf
  [ $# -eq 2 ] && BIND_ADDR=$1 LD_PRELOAD=/usr/lib/bindhack.so $2
  echo ''
  # restore the original resolv.conf
  `type -P cp` /etc/resolv.conf.bak /etc/resolv.conf
}

 

Take a look at bindhack.c:

#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <string.h>
#include <dlfcn.h>

#include <arpa/inet.h>

/* 
   This is the address you want to force everything to use. It can be
   overridden at runtime by specifying the BIND_SRC environment 
   variable.
*/
#define SRC_ADDR	"192.168.0.1"

/* 
   LIBC_NAME should be the name of the library that contains the real
   bind() and connect() library calls. On Linux this is libc, but on
   other OS's such as Solaris this would be the socket library
*/
#define LIBC_NAME	"libc.so.6" 

#define YES	1
#define NO	0

int
bind(int sockfd, const struct sockaddr *my_addr, socklen_t addrlen)
{

	struct sockaddr_in src_addr;
	void	*libc;
	int	(*bind_ptr)(int, void *, int);
	int	ret;
	int	passthru;
	char 	*bind_src;

#ifdef DEBUG
	fprintf(stderr, "bind() override called for addr: %s\n", SRC_ADDR);
#endif

	libc = dlopen(LIBC_NAME, RTLD_LAZY);

	if (!libc)
	{
		fprintf(stderr, "Unable to open libc!\n");
		exit(-1);
	}

	*(void **) (&bind_ptr) = dlsym(libc, "bind");

	if (!bind_ptr)
	{
		fprintf(stderr, "Unable to locate bind function in lib\n");
		exit(-1);
	}
	
	passthru = YES;	/* By default, we just call regular bind() */

	if (my_addr==NULL)
	{
		/* If we get a NULL it's because we're being called
		   from the connect() hack */

		passthru = NO;

#ifdef DEBUG
		fprintf(stderr, "bind() Received NULL address.\n");
#endif

	}
	else
	{

		if (my_addr->sa_family == AF_INET)
		{
			struct sockaddr_in	myaddr_in;

			/* If this is an INET socket, then we spring to
			   action! */
			passthru = NO;

			memcpy(&myaddr_in, my_addr, addrlen);

			src_addr.sin_port = myaddr_in.sin_port;


		}
		else
		{
			passthru = YES;
		}

	}

	if (!passthru)
	{

#ifdef DEBUG
		fprintf(stderr, "Proceeding with bind hack\n");
#endif

		src_addr.sin_family = AF_INET;

		bind_src=getenv("BIND_SRC");

		/* If the environment variable BIND_SRC is set, then use
		   that as the source IP to bind instead of the hard-coded
		   SRC_ADDR one.
		*/
		if (bind_src)
		{
			ret = inet_pton(AF_INET, bind_src, &src_addr.sin_addr);
			if (ret<=0)
			{
				/* If the above failed, then try the
				   built in address. */

				inet_pton(AF_INET, SRC_ADDR, 
						&src_addr.sin_addr);
			}
		}
		else
		{
			inet_pton(AF_INET, SRC_ADDR, &src_addr.sin_addr);
		}


	/* Call real bind function */
		ret = (int)(*bind_ptr)(sockfd, 
					(void *)&src_addr, 
					sizeof(src_addr));
	}
	else
	{

#ifdef DEBUG
		fprintf(stderr, "Calling real bind unmolested\n");
#endif

	/* Call real bind function */
		ret = (int)(*bind_ptr)(sockfd, 
					(void *)my_addr, 
					addrlen);

	}
#ifdef DEBUG
	fprintf(stderr, "The real bind function returned: %d\n", ret);
#endif

	/* Clean up */
	dlclose(libc);

	return ret;

}

/* 
	Sometimes (a lot of times) programs don't bother to call bind() 
	if they're just making an outgoing connection. To take care of
	these cases, we need to call bind when they call connect 
	instead. And of course, then call connect as well...
*/

int
connect(int  sockfd, const struct sockaddr *serv_addr, socklen_t addrlen)
{
	int	(*connect_ptr)(int, void *, int);
	void	*libc;
	int	ret;

#ifdef DEBUG
	fprintf(stderr, "connect() override called for addr: %s\n", SRC_ADDR);
#endif

	/* Before we call connect, let's call bind() and make sure we're
	   using our preferred source address.
	*/

	ret = bind(sockfd, NULL, 0); /* Our fake bind doesn't really need
					those params */

	libc = dlopen(LIBC_NAME, RTLD_LAZY);

	if (!libc)
	{
		fprintf(stderr, "Unable to open libc!\n");
		exit(-1);
	}

	*(void **) (&connect_ptr) = dlsym(libc, "connect");

	if (!connect_ptr)
	{
		fprintf(stderr, "Unable to locate connect function in lib\n");
		exit(-1);
	}


	/* Call real connect function */
	ret = (int)(*connect_ptr)(sockfd, (void *)serv_addr, addrlen);

	/* Clean up */
	dlclose(libc);

	return ret;	

}