
Quick text file merging and data preparation

In natural language processing, you very often have to re-format your data before feeding it into different systems. In such cases, these simple Linux commands help you do it much more quickly, without having to write a script.

1. Merging two files into one file with two columns

Input f1 looks like this:
1
2
3
4
Input f2 looks like this:
a
b
c
d
Output f3 will look like this:
1  a
2  b
3  c
4  d
Command: paste f1 f2 > f3 
The default delimiter is a tab. You can also specify it yourself (for example, a comma) as follows:
paste -d ',' f1 f2 > f3
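paste is not limited to two inputs. Assuming a third one-column input file (call it f4, an illustrative name), several files can be merged in one call:
paste f1 f2 f4 > f5
paste -d ',' f1 f2 f4 > f5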

2. Adding a line number to each line of a text file

Assume that you want to index each line of a text file, i.e., insert a line number followed by a tab before the content of each line:
Input f1:
a
b
c
d
Output f2:
1  a
2  b
3  c
4  d
Command: nl f1 > f2
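By default, GNU nl pads the line number with leading spaces and skips blank lines when numbering. If you want a plain number with no padding, or want blank lines numbered as well, its options cover that; a small sketch, assuming GNU coreutils:
nl -w 1 f1 > f2        # no leading padding before the number
nl -b a -w 1 f1 > f2   # also number blank lines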

3. Joining two files with a common field

Input f1:
1  aaa
2  bbb
3  ccc
4  ddd
Input f2:
1  a                              
2  b                              
3  c                              
4  d                              
Output f3 (joining on the first field):
1 aaa a                                
2 bbb b                                
3 ccc c                                
4 ddd d                                
Command: join f1 f2 > f3
We can also use the join command to join on other fields/columns (the files must be sorted on the join field first), as sketched below. Further options are described in the join manual (man join).
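For example, to join on the second field of f1 and the first field of f2, both files first need to be sorted on their join fields. A rough sketch, assuming GNU sort/join and whitespace-separated columns (f1.sorted and f2.sorted are just illustrative names for the sorted copies):
sort -k2,2 f1 > f1.sorted
sort -k1,1 f2 > f2.sorted
join -1 2 -2 1 f1.sorted f2.sorted > f3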

Comments

  1. To get certain columns (e.g., columns 3 and 5) from a text file data.txt, one can use the "cut" command as follows:
    cut -d' ' -f3,5 < data.txt
    This is usually much faster than looping over the lines in a shell script.
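    If the columns are separated by runs of spaces or tabs rather than exactly one space, awk is a common alternative (relying on its default whitespace field splitting):
    awk '{print $3, $5}' data.txt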


