Friday, May 31, 2013

Load the iOS print popover inside our popover

I had a problem: after opening a file there is a share button, and tapping it shows a popover. This popover contains two buttons, one of which is Print.

The problem is that after tapping the Print button, the print popover appears as a separate popover. The reason is that the print popover is generated by iOS itself; we only supply a few parameters, such as the path of the file to print. The print controller is not a UIViewController, so it cannot be pushed onto the navigation controller.

So iOS gives us a delegate method that returns the view controller (here, the navigation controller) into which the print interface should be pushed.

The method is:

#pragma mark - UIPrintInteractionControllerDelegate

- (UIViewController *)printInteractionControllerParentViewController:(UIPrintInteractionController *)printInteractionController
{
    return self.firstPopoverController.navigationController;
}

If you implement this delegate method, your print popover is loaded inside the navigation controller you return.
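For context, here is a minimal sketch (not the original project code) of presenting the print interaction controller so that this delegate method gets consulted; filePath and printBarButtonItem are placeholder names:

UIPrintInteractionController *printController = [UIPrintInteractionController sharedPrintController];
printController.delegate = self; // so printInteractionControllerParentViewController: is called
printController.printingItem = [NSURL fileURLWithPath:filePath]; // filePath is an assumed variable

UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputGeneral;
printController.printInfo = printInfo;

[printController presentFromBarButtonItem:printBarButtonItem // placeholder bar button item
                                 animated:YES
                        completionHandler:^(UIPrintInteractionController *controller, BOOL completed, NSError *error) {
    if (!completed && error) {
        NSLog(@"Printing failed: %@", error);
    }
}];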

Tuesday, May 28, 2013

Identify single tap and double tap in iOS

I faced an issue with gestures: I used a custom gesture recognizer for the single tap, while the double-tap gesture was built into another component, so it was not visible to me and I could not reference it when creating the single-tap recognizer.
The issue was that when I double-tapped the page, the app treated it as two single taps and ran the single-tap task.

I used the following approach to solve this issue.

1. Create the single-tap gesture recognizer like this:

tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(toggleBars)];
tapGestureRecognizer.numberOfTouchesRequired = 1;
tapGestureRecognizer.delegate = self;
[self.view addGestureRecognizer:tapGestureRecognizer];

2. Create the target method like this; it delays the real work so a second tap can cancel it:

- (void)toggleBars
{
   [self performSelector:@selector(performToggleBars) 
              withObject:nil 
              afterDelay:0.4];
}

- (void)performToggleBars
{
   //task you need to process.
}

3. Implement the following UIGestureRecognizerDelegate method:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch
{
  if (touch.tapCount != 1)
  {
    [NSObject cancelPreviousPerformRequestsWithTarget:self
                                            selector:@selector(performToggleBars)
                                              object:nil];
  }

  return YES;
}

With this approach, the pending single-tap action is cancelled whenever the touch turns out to be more than one tap.

Wednesday, May 22, 2013

Solution for popover resizing issue

Suppose you have two controllers, A and B, shown inside a UIPopoverController but with different sizes, and they are pushed onto a navigation stack: first controller A is loaded, then controller B, where B is larger than A. While pushing controllers onto the navigation stack the issue does not appear, but after going back through the navigation controller (popping the controllers off the stack), the popover size does not change; it keeps the size of the last pushed controller (B). This is an iOS issue: the popover does not take the current controller's size.

The solution is to use controller A's viewDidAppear method to forcefully change the size and then change it back again; that fixes the issue.

Example :


- (void)forcePopoverResize
{
    CGSize currentSetSizeForPopover = self.contentSizeForViewInPopover;
    CGSize tempPopoverSize = CGSizeMake(currentSetSizeForPopover.width - 1.0f, currentSetSizeForPopover.height - 1.0f);
    self.contentSizeForViewInPopover = tempPopoverSize;
    self.contentSizeForViewInPopover = currentSetSizeForPopover;
}

You can call this method inside viewDidAppear:

[self forcePopoverResize];
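For example, a minimal viewDidAppear: override (a sketch that assumes nothing else needs to happen there) would be:

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // Nudge the content size so the popover picks up this controller's size again.
    [self forcePopoverResize];
}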

Now your problem is solved :)

Tuesday, March 26, 2013

GMGridView: long-press delete and moving a cell at the same time

GMGridView is a very useful library for cell manipulation in iOS projects: moving cells, deleting cells, and so on. But I found a problem: you cannot run two of these animations at the same time, for example the moving and delete animations together.

So I changed it to behave the way I wanted.


Problem: long-press delete and moving a cell at the same time.

Solution: a long press activates delete mode; from there you can either delete the cell or move it. While a cell is being moved, delete mode is deactivated, and after the cell is released, delete mode becomes active again.

So I created a subclass of GMGridView and changed the rules to what I wanted. Some methods and variables are private, so you have to move them into the GMGridView.h file.

Step 1: Add the properties to the GMGridView header file.

Synthesize these properties and remove the private variables with the same names from the GMGridView.h file.

Step 2: Add the method declarations to the GMGridView header file.

Step 3: Create a subclass of GMGridView (GMGridViewSub) and change the rules as follows.
Step 3.1: Header file

Step 3.2: Implementation file (.m)
Step 3.2.1: Import the header file and the constants as follows.

Step 3.2.2: Add the following init and functions within the implementation scope.

Each one first calls the superclass implementation and then runs the remaining work. Change the validation rule as in the following code, because otherwise it is checked without considering the editing state.

longPressGestureUpdated: add the moving/editing animation as follows.
sortingMoveDidStart: before the move starts, the editing mode is set to NO.
Change the following function so that editing is YES; after the animation stops, set it like this:

if (self.editing)
    self.sortMovingItem.editing = YES;
While a cell is being moved, its editing flag is NO.

Now you have built the subclass of GMGridView, and you can use the subclass wherever you want instead of GMGridView. Add a reference to it like this:

GMGridViewSub *gmGridView = [[GMGridViewSub alloc] init];
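For example, a fuller setup might look like this (a sketch assuming the standard GMGridView dataSource and delegate properties; adjust to the version of the library you are using):

GMGridViewSub *gmGridView = [[GMGridViewSub alloc] initWithFrame:self.view.bounds];
gmGridView.dataSource = self;        // supplies the cells
gmGridView.actionDelegate = self;    // tap / delete callbacks
gmGridView.sortingDelegate = self;   // moving-cell callbacks
[self.view addSubview:gmGridView];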


Friday, September 21, 2012

How to create a Struts, Spring, Hibernate project


You probably all know about raw J2EE. But when you are developing a web-based application following MVC, you should also know about frameworks such as Struts, Spring, EJB and Hibernate, because they make development and maintenance much easier.

The following diagram shows the overall idea of where these frameworks can be used in your project.


First, I would like to show you how to create a Struts project.

Struts project

Why do we use Struts and where do we use it? Struts works according to URL patterns. It handles all incoming requests and redirects each request according to its rules, so Struts acts as the controller. Typically the request is then passed to the business logic (for example EJB), which finally knows where the request needs to go.

Before developing the application you need some libraries. There are many Struts versions, but I plan to use version 2.0.6. In this setup there is no struts.xml file, because everything is defined in web.xml.
  1. Create a dynamic web project in Eclipse.
  2. Add the needed libraries to the project.
  3. Change the index.jsp file like this.


Here test.action is the URL pattern. When this button is clicked, the request goes to web.xml, which defines where the action classes are for each request.
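As a rough sketch of what index.jsp might contain (the form and button names below are placeholders, not the original page):

<html>
  <body>
    <!-- Submitting this form produces a request matching the *.action pattern -->
    <form action="test.action" method="post">
      <input type="submit" name="testButton" value="Test" />
    </form>
  </body>
</html>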

    4. Add the following code to web.xml.

       In web.xml:



If a request comes in with the "*.action" pattern, the container checks the filter name in the filter mapping, then goes to the filter with that name, which loads the action classes into memory; finally it takes the prefix of the request pattern and looks for the functions in the class whose name matches that prefix.

If there is more than one package of action classes, list the packages separated by "," in the param-value tag.

"com.strut.test" is the package of action classes.
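As a sketch of the kind of web.xml filter entry described here, assuming the Struts 2 FilterDispatcher with the Codebehind zero-configuration actionPackages parameter (the filter name is a placeholder):

<filter>
    <filter-name>struts2</filter-name>
    <filter-class>org.apache.struts2.dispatcher.FilterDispatcher</filter-class>
    <init-param>
        <param-name>actionPackages</param-name>
        <param-value>com.strut.test</param-value>
    </init-param>
</filter>
<filter-mapping>
    <filter-name>struts2</filter-name>
    <url-pattern>*.action</url-pattern>
</filter-mapping>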


      5. Create an action class in the com.strut.test package.




When creating action classes, the class name should end with "Action", and the prefix of the class name depends on the request pattern created for the project.

According to my example, this action class should be named "TestAction". The class implements "ServletRequestAware", and the methods to override are:
  1. void setServletRequest(HttpServletRequest request)
  2. String execute()

When a request comes to this class, it is assigned to the request variable and then the execute function runs. In the execute function, identify which button was clicked, send the request to the business logic, take the response, and redirect to another JSP page or Java servlet.

The execute function returns a String, and the string pattern defined inside the action class determines where the request is redirected.
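A minimal sketch (not the original class) of the kind of action described above; the package and class name follow the example in the text, and the result value is an assumption:

package com.strut.test;

import javax.servlet.http.HttpServletRequest;
import org.apache.struts2.interceptor.ServletRequestAware;

public class TestAction implements ServletRequestAware {

    private HttpServletRequest request;

    // The incoming request is handed to the action before execute() runs.
    public void setServletRequest(HttpServletRequest request) {
        this.request = request;
    }

    // Identify what was clicked, call the business logic, and return a result
    // string; the result mapping (the Results annotation or equivalent
    // configuration) sends it on to the "/test" pattern handled in web.xml.
    public String execute() {
        // ... call the business logic using values from this.request ...
        return "test";
    }
}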



If there are more return Strings, you can add another result, separated by ",", inside the Results annotation like this.



The request then goes to the web.xml file, where the string pattern is checked to show the result.


      6. Add the following code inside web.xml.




Finally the string pattern "/test" is checked, the servlet name is taken from the matching servlet mapping, the servlet entry with the same name is found, and that entry gives the destination (a JSP page; it may change according to your requirements).
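A sketch of the kind of servlet mapping this describes (the servlet name and JSP path are placeholders):

<servlet>
    <servlet-name>testResult</servlet-name>
    <jsp-file>/result.jsp</jsp-file>
</servlet>
<servlet-mapping>
    <servlet-name>testResult</servlet-name>
    <url-pattern>/test</url-pattern>
</servlet-mapping>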







Sunday, January 29, 2012

How to run an Apache Hadoop cluster program

What is Apache Hadoop? It is a project that develops open-source software for reliable, scalable, distributed computing. The Apache Hadoop software library is a framework that allows the distributed processing of large data sets across clusters of computers using a simple programming model. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than relying on hardware to deliver high availability, the library itself is designed to detect and handle failures at the application layer, delivering a highly available service on top of a cluster of computers.

When setting up Apache Hadoop in practice, lots of problems come up, because there is little accurate documentation for Hadoop and some of it depends on the operating system you use. All of our batch members faced plenty of problems; in the end Thilina and I configured an Apache Hadoop cluster setup within 4 hours. So I am trying to share that knowledge, because it may help you. This tutorial is divided into two parts: setting up the SSH connections between the nodes, and configuring Hadoop.

Resources

Ubuntu 11.04,11.10
hadoop-0.20.2.tar.gz
JDK 1.6.0_24

Please do everything in the following order.

1. Install the ssh and rsync packages on your machine:

    $ sudo apt-get install ssh
    $ sudo apt-get install rsync

If you run into a problem, update your machine's package repository like this and then run the commands above again:

    $sudo apt-get update

2. First you need to disable the firewall, because some other process may be using a port on the machine, which makes Hadoop hard to configure:

    $sudo ufw disable 

3. Create a user group as the super user:

    $su root
    $sudo groupadd hadoop_user


4. Create the user hadoop and assign it to the created user group:

     $sudo useradd --home-dir /home/hadoop --create-home --shell /bin/bash -U hadoop
     $sudo usermod -a -G hadoop_user hadoop 


5. Create a password for the new user account (whatever you want; enter the password twice):

    $passwd hadoop


6. Then check whether it works:

    $su hadoop

7. Restart the machine and log in to the hadoop account.

8. Generate the key pair on the machine:

    $ ssh-keygen -t rsa
 
After pressing Enter, the public key and private key are created in ~/.ssh for the hadoop account. Do the steps above on the master node and on every slave machine as well.



9. Append the public key to the authorized keys on the same machine:

    $cat /home/hadoop/.ssh/id_rsa.pub >> /home/hadoop/.ssh/authorized_keys

10. Copy the master's public key to every slave node:

    $scp /home/hadoop/.ssh/id_rsa.pub IPADDRESS_OF_SLAVE:/home/hadoop/.ssh/master.pub


11. Then log in to each slave node from the master node and run this (it appends the master's public key to the slave node's authorized_keys):

    $cat /home/hadoop/.ssh/master.pub >> /home/hadoop/.ssh/authorized_keys

12. Check that you can now log in to each slave machine, and to localhost, without a password:

    $ssh IPADDRESS_OF_SLAVE
    $ssh localhost


13. Log in to the hadoop account and create a project folder in the hadoop home directory:

    $mkdir -p /home/hadoop/project

14. Install Hadoop into the project folder (first copy hadoop-0.20.2.tar.gz into the project folder):
     
    $cd /home/hadoop/project
    $tar -xzvf ./hadoop-0.20.2.tar.gz


15. Change the environment variables in the .profile or .bashrc file:

    export JAVA_HOME=/home/hadoop/jdk1.6.0_24   # if you installed the JDK manually
    export HADOOP_HOME=/home/hadoop/project/hadoop-0.20.2
    export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$PATH 


If you did not install the JDK manually, remove the first line above.


16. Change the JAVA_HOME environment variable in hadoop/conf/hadoop-env.sh: uncomment it and set it to your Java location.

    export JAVA_HOME=/home/hadoop/jdk1.6.0_24

17. Configure the Hadoop parameters in the configuration XML files.
Change HADOOP_HOME/src/core/core-default.xml. Using the master node's IP address here is fine; if you want to use a domain name instead, you need to change the hosts file in /etc/.

    <configuration>
        <property>
               <name>fs.default.name</name>
                <value>hdfs://192.168.10.2:9000/</value>
        </property>
        <property>
               <name>hadoop.tmp.dir</name>
               <value>/tmp/hadoop-${user.name}</value>
        </property>

   </configuration>

Change HADOOP_HOME/src/hdfs/hdfs-default.xml:

      <configuration>
            <property>
                  <name>dfs.replication</name>
                  <value>2</value>
            </property>
            <property>
                <name>dfs.block.size</name>
                <value>128000000</value>
            </property>
            <property>
                <name>dfs.permissions</name>
                <value>true</value>
            </property>
    </configuration>


Change HADOOP_HOME/src/mapred/mapred-default.xml. You can run the JobTracker on a different machine if you want; you only need to change the JobTracker's IP address:

    <configuration>
          <property>
             <name>mapred.job.tracker</name>
             <value>hdfs://192.168.10.2:9001</value>
          </property>
          <property>
             <name>mapred.child.java.opts</name>
             <value>-Xmx512m</value>
          </property>

    </configuration> 

After changing the XML files, put the master's IP address in the masters file in the conf folder, and put the slaves' IP addresses in the slaves file in the conf folder.
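For example, with the master at 192.168.10.2 and two slaves (the slave addresses here are placeholders), the files would look roughly like this:

    conf/masters:
        192.168.10.2

    conf/slaves:
        192.168.10.3
        192.168.10.4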

18. After configuring the master node you can copy the Hadoop installation to the other machines:

      
    $scp -r /home/hadoop/project IPADDRESS_OF_SLAVE:/home/hadoop/

Run the above command for each node, changing the slave's IP address.

19. Now change the environment variables on each slave node in the same way, in the .bashrc or .profile file and in hadoop/conf/hadoop-env.sh.

20. Now log in to the master node as the hadoop account and format the NameNode (run this from inside the bin folder):

    $hadoop namenode -format


21. Start the cluster:

    $start-all.sh

22. Run your jar file on the Hadoop cluster:

    $hadoop jar /home/hadoop/smscount.jar org.sms.SmsCount /home/hadoop/smscount/input  /home/hadoop/smscount/output 

Before running it, we need to know how to handle HDFS folders:
                 
     Create a folder:         $hadoop dfs -mkdir /home/hadoop/smscount/input
     List files and folders:  $hadoop dfs -ls /home/hadoop/smscount/input
     Remove a folder:         $hadoop dfs -rmr /home/hadoop/smscount/input
     Put a file:              $hadoop dfs -put /home/hadoop/file1 /home/hadoop/smscount/input
          
After the smscount jar runs, the HDFS output folder is created automatically. Remember, if you need to run it a second time, remove the output folder first, because otherwise it throws an exception.
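For example, before a second run you could clear the previous output (the path matches the example above):

    $hadoop dfs -rmr /home/hadoop/smscount/output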

23. To view the result:

    $hadoop dfs -cat /home/hadoop/smscount/output/part-00000
 
24. Stop the cluster:

    $stop-all.sh


25. If everything is running, you can check the following links and enjoy:

    Hadoop Distributed File System (HDFS):
        http://IPADDRESS_OF_NAMENODE:50070

    Hadoop Jobtracker:
        http://IPADDRESS_OF_JOBTRACKER:50030

    Hadoop Tasktracker:
        http://IPADDRESS_OF_TASKTRACKER:50060