Do you ever wish you could sleep your way through the workday? Well, it might not be quite as simple as that. But, there are some job opportunities out there that could be great for those who love to doze.
There are all kinds of work opportunities available these days. You just have to know where to look. It’s always a great idea, when possible, to find work doing something you enjoy. Being invested in your job can help you stay productive and engaged. But what if one of the things you love to do most is sleep? You might need to invest a little more than just that passion in order to fulfill all of your responsibilities, but there are some jobs for people who really enjoy sleeping.
1. SCIENTIFIC RESEARCH SUBJECT
Many hospitals and universities need people to participate in various studies as scientific research subjects. These studies cover all kinds of ground, of course, but some of the projects definitely involve getting paid to sleep. One sleep study at the University of Colorado is paying as much as $1280 to participants. Do be warned, though: sometimes sleep studies examine what happens when people don’t get enough shut-eye. The job of research subject likely won’t be as simple as tucking in and collecting a check.
1. Create a new Job.
2. Double-click on the work area to bring up the Job properties window. Use it to define two named parameters: FOLDER_NAME and FILE_NAME.
3. Drag a START, a Create a folder, and a Create file entry to the work area and link them in a row: START → Create a folder → Create file.
4. Double-click the Create a folder entry. As Folder name, type ${FOLDER_NAME}.
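The behavior this Job implements can be sketched in plain Python. This is only a model of what the entries do, assuming the two named parameters above; `run_job` is a hypothetical name, since in PDI the work is done by the Create a folder and Create file job entries, not by code:

```python
from pathlib import Path

def run_job(folder_name: str, file_name: str) -> Path:
    """Model of the Job: create the folder, then an empty file inside it."""
    folder = Path(folder_name)             # value of ${FOLDER_NAME}
    folder.mkdir(parents=True, exist_ok=True)  # Create a folder entry
    target = folder / file_name            # value of ${FILE_NAME}
    target.touch()                         # Create file entry (empty file)
    return target
```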
Using Job executors
The Job Executor is a PDI step that allows you to execute a Job several times simulating a loop. The executor receives a dataset, and then executes the Job once for each row or a set of rows of the incoming dataset. To understand how this works, we will build a very simple example. The Job that we will execute will have two parameters: a folder and a file. It will create the folder, and then it will create an empty file inside the new folder. Both the name of the folder and the name of the file will be taken from the parameters. The main transformation will execute the Job iteratively for a list of folder and file names.
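The iteration described above can be sketched as follows. The names are hypothetical: `execute_job` stands in for the Job, and `main_transformation` stands in for the transformation that hosts the Job Executor step:

```python
def execute_job(folder_name: str, file_name: str) -> str:
    # Stand-in for the Job: report the path it would create
    return f"{folder_name}/{file_name}"

def main_transformation(rows):
    # Job Executor default behavior: run the Job once per incoming row,
    # passing that row's values as the Job's named parameters
    return [execute_job(folder, file) for folder, file in rows]
```

For example, a dataset of two rows produces two executions of the Job, one per folder/file pair.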
By default, the Job Executor executes once for every row in your dataset, but there are several options you can configure in the Row Grouping tab of the configuration window:
You can send groups of N rows, where N is greater than 1
You can pass a group of rows based on the value in a field
You can send groups of rows based on the time the step collects rows before executing the Job
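The first two grouping modes can be sketched as below. The function names are hypothetical; in PDI you simply set these options in the Row Grouping tab rather than writing code:

```python
from itertools import groupby

def group_by_size(rows, n):
    # "Groups of N rows": split the dataset into chunks of size n
    return [rows[i:i + n] for i in range(0, len(rows), n)]

def group_by_field(rows, key):
    # "Group on a field value": consecutive rows sharing the key
    # are sent to the Job together
    return [list(g) for _, g in groupby(rows, key=key)]
```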
Using variables and named parameters
If the Job has named parameters—as in the example that we built—you provide values for them in the Parameters tab of the Job Executor step. For each named parameter, you can assign the value of a field or a fixed (static) value. If you execute the Job for a group of rows instead of a single one, the parameters will take their values from the first row of data sent to the Job.
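That first-row rule can be sketched as follows (the function name is hypothetical; PDI applies this rule internally when a group of rows is sent to the Job):

```python
def parameters_for_group(group, param_fields):
    # When the Job runs once per group of rows, each named parameter
    # takes its value from the first row of the group
    first = group[0]
    return {name: first[name] for name in param_fields}
```

So a group of several rows still yields a single set of parameter values for that execution of the Job.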