Saturday, 6 April 2019

How the Spring MVC flow works, with interview questions and answers

Workflow of a Spring MVC Application:-
...............................................................
1. Whenever a client makes a request, it is trapped by the DispatcherServlet.

2. The DispatcherServlet reads the bean configuration file and then takes the support of HandlerMappings to identify the controller classes.

**HandlerMappings are predefined Java classes provided by Spring.

**We generally configure HandlerMappings within the bean configuration file.

**HandlerMappings contain information about the controller classes.

**In order to process each client request we generally maintain a separate controller class.

3. The DispatcherServlet then invokes the controller class for the client request with the help of a HandlerMapping.

4. The controller class receives the complete client information in the form of a command object.

5. The controller class interacts with the model classes.

6 & 7. Based on the results returned by the model classes, the controller class returns a ModelAndView object to the DispatcherServlet.

The ModelAndView object contains the information returned by the model class and the presentation page.

8. The DispatcherServlet takes the support of a ViewResolver to identify the presentation page.

A ModelAndView object may contain either logical information about the presentation page or the exact information about the presentation page.

If the ModelAndView object contains logical information about the presentation page, the DispatcherServlet takes the support of a ViewResolver.

If the ModelAndView object contains the exact information about the presentation page, the DispatcherServlet invokes the presentation page directly, without taking the support of a ViewResolver.

=>Generally we can return a ModelAndView either by specifying the "logical name" of the JSP page
or
by specifying the exact name of the JSP page, as in the sketch below.
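A minimal sketch (the LoginController class, the request parameter and the view name are illustrative, not from the original flow description) of an old-style controller that the DispatcherServlet invokes through a HandlerMapping:

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.web.servlet.ModelAndView;
import org.springframework.web.servlet.mvc.Controller;

public class LoginController implements Controller {

    // the DispatcherServlet calls this method once a HandlerMapping selects this controller
    public ModelAndView handleRequest(HttpServletRequest request, HttpServletResponse response)
            throws Exception {
        String user = request.getParameter("userName"); // read the client request data
        // ... interact with the model classes here ...
        return new ModelAndView("Success", "user", user); // logical view name + model data
    }
}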

................***................***................***........................***.............

Understanding HandlerMappings:-
..............................................................
=>HandlerMappings are predefined Java classes provided by Spring.

=>We can configure them as Spring beans within the bean configuration file.

=>HandlerMappings contain information about the controller classes.

=>HandlerMappings identify the controller class based on the client request.

The predefined HandlerMappings given by Spring are listed below (a configuration sketch follows the list):

=>BeanNameUrlHandlerMapping
=>SimpleUrlHandlerMapping
=>CommonsPathMapUrlHandlerMapping
=>ControllerClassNameUrlHandlerMapping
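A hedged Java-configuration sketch (the "/login.htm" URL and the anonymous controller are illustrative) of how BeanNameUrlHandlerMapping picks a controller: the bean whose name matches the request URL is the controller that gets invoked.

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.ModelAndView;
import org.springframework.web.servlet.handler.BeanNameUrlHandlerMapping;
import org.springframework.web.servlet.mvc.Controller;

@Configuration
public class HandlerMappingConfig {

    @Bean
    public BeanNameUrlHandlerMapping handlerMapping() {
        return new BeanNameUrlHandlerMapping();
    }

    // the bean name "/login.htm" is the URL this controller handles
    @Bean(name = "/login.htm")
    public Controller loginController() {
        return new Controller() {
            public ModelAndView handleRequest(HttpServletRequest request,
                                              HttpServletResponse response) throws Exception {
                return new ModelAndView("Success");
            }
        };
    }
}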




------------------------------------------------------------------------------------------------------------------

Understanding a "ModelAndView" Object:-
....................................................................
=>The org.springframework.web.servlet.ModelAndView object is a predefined java class that was given by the Spring....

=>The ModelAndView object contains a information about the response page and the module class generated result....

=>The ModelAndView object can be created in two ways

**By Specifying the exactname of a ResponsePage...
or
**By Specifying the logicalname of a ResponsePage...

Case 1:-

public ModelAndView handleRequest(HttpServletRequest request, HttpServletResponse response) throws Exception
{
.........
.......
return new ModelAndView("Success.jsp");
}


Case 2:-

public ModelAndView handleRequest(....)
{
.........
.......
List<String> list=new ArrayList<String>();
list.add("Ashok");
list.add("Mahesh");
........
......
return new ModelAndView("Success","listOfNames",list);
}

=>Here the "list" object is internally stored within a Request scope with keyname "listOfNames".....

=>Here we are returning ModelAndView object by specifieng the logical name of the response page....


----------------------------------------------------------------------------------

Understanding ViewResolver:-
.................................................
=>A ViewResolver is used to identify and invoke the response page.

=>Whenever a ModelAndView object returns a logical name to the DispatcherServlet, the DispatcherServlet takes the support of a ViewResolver to identify the response page.

=>ViewResolver is an interface; one of its implementation classes is InternalResourceViewResolver.

=>We generally configure the InternalResourceViewResolver in the bean configuration file:

<bean id="viewResolver" class="org.springframework.web.servlet.view.InternalResourceViewResolver">
<property name="prefix">
<value>/</value>
</property>
<property name="suffix">
<value>.jsp</value>
</property>
</bean>

=>The InternalResourceViewResolver adds "/" as a prefix and ".jsp" as a suffix to the logical name, so the logical name "Success" resolves to "/Success.jsp".

Note:-
..........
=>If we are returning a ModelAndView object with a logical name, we must configure a ViewResolver within the bean configuration file.

return new ModelAndView("Success");


=>If we are returning a ModelAndView object with the exact JSP name, we need not configure a ViewResolver within the bean configuration file.

return new ModelAndView("Success.jsp");


=>If we save the bean configuration file under a name other than <dispatcherservletname>-servlet.xml, we should configure it in web.xml using <context-param>; but if we take the support of <context-param>, we must also configure the ContextLoaderListener class.

We can make this easier by configuring the bean configuration file as an <init-param> (initialization parameter) of the DispatcherServlet.

If we configure it as an initialization parameter, we do not need to configure any explicit listener class.

ForEx:-
......

Bean configuration file: login-servlet.xml


<web-app>
<servlet>
<servlet-name>ds</servlet-name>
<servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
<init-param>
<param-name>contextConfigLocation</param-name>
<param-value>/WEB-INF/login-servlet.xml</param-value>
</init-param>
</servlet>
.....
.....
If there are multiple bean configuration files, we can separate them using ",".

Here the "contextConfigLocation" is a predefined initialization parameter.....



To compile the above application we need to add the following jars to the classpath:
=>web.jar
=>web.servlet.jar
=>servlet-api.jar

-----------------------------------------------------------------------------------------------------------------


Java OOPS Interview Question and Answer

OOPS Concept Interview Question
--------------------------------------------------------------------------------------------------
1) What are the OOPs concepts, with examples?
  a - abstraction
  b - encapsulation
  c - polymorphism
  d - inheritance

-------------------------------------------------------------------------------------------------
Q-What is abstraction?

Ans:-Abstraction means hiding unnecessary details and showing only the relevant data.
 Abstraction is a process where you show only "relevant" data and "hide" unnecessary details of an object from the user. For example, when you log in to your Amazon account online, you enter your user_id and password and press login; what happens when you press login, how the input data is sent to the Amazon server, and how it gets verified is all abstracted away from you.

Advantages of Abstraction
1. It reduces the complexity of viewing things.
2. It avoids code duplication and increases reusability.
3. It helps to increase the security of an application or program, as only the important details are provided
         to the user.
Another real-life example of abstraction is an ATM machine:
everyone performs operations on the ATM machine such as cash withdrawal, money transfer and
retrieving a mini-statement, but we cannot know the internal details of the ATM.

Note: Data abstraction can be used to protect the data from unauthorized methods.
Note: In Java, data abstraction can be achieved using classes.

Example of Abstraction:-

class Customer
{
int account_no;
float balance_Amt;
String name;
int age;
String address;
void balance_inquiry()
{
/* to perform a balance inquiry only the account number
is required, which means the remaining properties
are hidden from the balance inquiry method */
}
void fund_Transfer()
{
/* to transfer funds, the account number and
balance are required, and the remaining properties
are hidden from the fund transfer method */
}
}

Q: How to achieve Abstraction?
Ans:-There are two ways to achieve abstraction in Java:
Abstract class (0 to 100%)
Interface (achieves 100% abstraction)

----------------------------------------------------------------------------------------------------------
Q-What is an interface?
ans:- An interface is a contract between the client and the developer.
note:-"An interface contains common method prototypes that will be implemented by multiple subclasses."

Real-time scenario example:-
Suppose I have a banking application, and in that banking application I have four modules:
(1) savings a/c
(2) deposit a/c
(3) current a/c
(4) fixed deposit
There is a chance that all of these modules will be developed by different developers or even different companies.
Now, what I want is this: there are some methods, say 20 methods, and those 20 methods should be the same across all the modules;
the savings a/c developer has to override the same methods, and every other developer has to override them too.

"The implementation may change, but if they want to check the balance there is a getBalance method:
it takes an account number as a parameter and its return type is double.
There is also a withdraw method and a deposit method.
So these method prototypes are common among the multiple subclasses."

note:- If you want to centralize common method prototypes, you can write them inside an interface,
         so an interface contains common method prototypes that will be implemented by multiple subclasses (see the sketch below).
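A minimal sketch of that banking scenario (the Account interface, its method signatures and the SavingsAccount class are illustrative): the interface centralizes the common method prototypes, and each module supplies its own implementation.

public interface Account {
    double getBalance(String accountNumber);              // takes an account number, returns the balance
    void deposit(String accountNumber, double amount);
    void withdraw(String accountNumber, double amount);
}

class SavingsAccount implements Account {
    public double getBalance(String accountNumber) { return 0.0; /* module-specific logic */ }
    public void deposit(String accountNumber, double amount) { /* module-specific logic */ }
    public void withdraw(String accountNumber, double amount) { /* module-specific logic */ }
}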

Example of Interface
interface Person
{
void run();  // abstract method
}
class A implements Person
{
public void run()
{
System.out.println("Run fast");
}

public static void main(String args[])
 {
 A obj = new A();
 obj.run();
 }
}
Output
Run fast

Multiple Inheritance using interface
Example
interface Developer
{
void disp();
}
interface Manager
{
void show();
}

class Employee implements Developer, Manager
{
public void disp()
{
System.out.println("Hello Good Morning");
}
public void show()
{
System.out.println("How are you ?");
}
public static void main(String args[])
{
Employee obj=new Employee();
obj.disp();
obj.show();
}
}
Output
Hello Good Morning
How are you ?

==========================================================================================================
Q:-Why do we use an interface?
Ans-• It is used to achieve full abstraction.
• By using an interface, you can achieve multiple inheritance in Java.
• It can be used to achieve loose coupling.

 Properties of an Interface:-
It is implicitly abstract, so we do not need to use the abstract keyword when declaring an interface.
Each method in an interface is also implicitly abstract, so the abstract keyword is not needed.
Methods in an interface are implicitly public.
All the data members of an interface are implicitly public static final.

Rules of an interface:-
1: Interfaces should contain only abstract methods.
2: Interfaces are implemented by a class using the 'implements' keyword.
3: By default, every field of an interface is public, static and final;
you can't change the value of a field once it is initialized, because it is static and final.
4: By default, all methods of an interface are public and abstract.
----------------------------------------------------------------------------------------------------------
Q:-When do we use an abstract class and when an interface?
Ans:-If we do not know anything about the implementation and we only have the requirement specification, then
we should go for an interface.
If we are talking about the implementation, but not completely (it is partially implemented), then
we should go for an abstract class.

Rules for implementing an interface
A class can implement more than one interface at a time.
A class can extend only one class, but implement many interfaces.
An interface can extend another interface, similarly to the way that a class can extend another class.
-------------------------------------------------------------------------------------------------------
Q-What is an abstract class in Java?
 A class that is declared abstract is known as an abstract class,
                           or
an abstract class is a class that has a partial implementation.
In an abstract class you can write abstract methods as well as concrete methods. Why write concrete methods?
There are situations where multiple subclasses share some common method behaviour.
Suppose there is a printStatement method whose responsibility is to take the
account number and print the statement,
and suppose this method behaviour is common to all the subclasses.
Why write the same method body in all the subclasses?
It would cause a code duplication problem.
note:- In future, if you want to change it, you would have to modify all the subclasses; so it is better to
write that common method behaviour in the abstract class.

note:- So we can say an abstract class contains the common method behaviour,
   and the concrete subclasses contain the specific method behaviour (see the sketch below).

Syntax:
abstract class <class-name>{}

An abstract class is something which is incomplete and you cannot create an instance of an abstract class.
If you want to use it you need to make it complete or concrete by extending it.
A class is called concrete if it does not contain any abstract method and implements
all the abstract methods inherited from the abstract class or interface
it has extended or implemented.
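A hedged sketch of that idea (the class and method names are illustrative): the common behaviour (printStatement) is written once in the abstract class, while each subclass supplies its own specific behaviour (withdraw).

abstract class BankAccount {

    // common method behaviour shared by every subclass -- written only once, here
    public void printStatement(String accountNumber) {
        System.out.println("Statement for account " + accountNumber);
    }

    // specific behaviour -- every subclass must provide its own implementation
    public abstract void withdraw(String accountNumber, double amount);
}

class SavingsBankAccount extends BankAccount {
    @Override
    public void withdraw(String accountNumber, double amount) {
        System.out.println("Withdrawing " + amount + " from savings account " + accountNumber);
    }
}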
Why does an interface have no constructor?
Because constructors are used to replace default values with user-defined values,
but in the case of an interface all the data members are public static final, which means they are constants,
so there is no need to replace those values.

The other reason is that a constructor is like a concrete method, and
an interface does not have concrete methods, it has only abstract methods;
that's why an interface has no constructor.
-------------------------------------------------------------------------------------------------------------

Q-What is Encapsulation?

Ans:-Binding data (variables and methods) into a single unit.
Encapsulation can be achieved by declaring all the variables in the class as private
and writing public methods in the class to set and get the values of those variables.

Advantages of Encapsulation:
•  Data Hiding: The user will have no idea about the inner implementation of the class.
   It will not be visible to the user how the class is storing values in the variables.
   He only knows that we are passing values to a setter method and the variables are getting initialized
   with those values.
•  Increased Flexibility: We can make the variables of the class read-only or write-only
   depending on our requirement. If we wish to make the variables read-only then
   we have to omit the setter methods like setName(), setAge() etc. from the program below, or
   if we wish to make them write-only then we have to omit the get methods like getName(),
   getAge() etc. from the program below.

•  Reusability: Encapsulation also improves re-usability and makes the code easy to change with new requirements.
•  Testing code is easy: Encapsulated code is easy to unit test.
// Java program to demonstrate encapsulation
public class Encapsulate
{
    // private variables declared
    // these can only be accessed by
    // public methods of class
    private String geekName;
    private int geekRoll;
    private int geekAge;
    // get method for age to access
    // private variable geekAge
    public int getAge()
    {
      return geekAge;
    }

    // get method for name to access
    // private variable geekName
    public String getName()
    {
      return geekName;
    }
   
    // get method for roll to access
    // private variable geekRoll
    public int getRoll()
    {
       return geekRoll;
    }

    // set method for age to access
    // private variable geekage
    public void setAge( int newAge)
    {
      geekAge = newAge;
    }

    // set method for name to access
    // private variable geekName
    public void setName(String newName)
    {
      geekName = newName;
    }
   
    // set method for roll to access
    // private variable geekRoll
    public void setRoll( int newRoll)
    {
      geekRoll = newRoll;
    }
}
In the above program the class Encapsulate is encapsulated, as the variables are declared private. The get methods like getAge(), getName() and getRoll() are declared public; these methods are used to access the variables. The setter methods like setName(), setAge() and setRoll() are also declared public and are used to set the values of the variables.
The program to access the variables of the class Encapsulate is shown below:
public class TestEncapsulation
{
    public static void main (String[] args)
    {
        Encapsulate obj = new Encapsulate();
       
        // setting values of the variables
        obj.setName("Harsh");
        obj.setAge(19);
        obj.setRoll(51);
       
        // Displaying values of the variables
        System.out.println("Geek's name: " + obj.getName());
        System.out.println("Geek's age: " + obj.getAge());
        System.out.println("Geek's roll: " + obj.getRoll());
       
        // Direct access of geekRoll is not possible
        // due to encapsulation
        // System.out.println("Geek's roll: " + obj.geekName);     
    }
}
Output:
Geek's name: Harsh
Geek's age: 19
Geek's roll: 51

Q:-Encapsulation vs Data Abstraction
Ans:-1. Encapsulation is data hiding (information hiding) while abstraction is detail hiding (implementation hiding).
2. While encapsulation groups together the data and the methods that act upon the data, data abstraction deals with exposing an interface to the user and hiding the details of the implementation.
3. Encapsulation does not provide full security, because we can still access private members of a class using the reflection API; in the case of abstraction, the hidden implementation details are simply not exposed to the caller at all.

---------------------------------------------------------------------------------------------------------

Q-What is polymorphism?
Ans-The ability to take more than one form is called polymorphism.


Example  of polymorphism;-


import java.util.ArrayList;
import java.util.List;

abstract class Pet{
    public abstract void makeSound();
}

class Cat extends Pet{

    @Override
    public void makeSound() {
        System.out.println("Meow");
    }
}

class Dog extends Pet{

    @Override
    public void makeSound() {
        System.out.println("Woof");
    }

}
Let's test how the polymorphism concept works in Java:
/**
 *
 * Java program to demonstrate What is Polymorphism
 * @author Javin Paul
 */
public class PolymorphismDemo{

    public static void main(String args[]) {
        //Now Pet will show How Polymorphism work in Java
        List<Pet> pets = new ArrayList<Pet>();
        pets.add(new Cat());
        pets.add(new Dog());
 
        //pet variable which is type of Pet behave different based
        //upon whether pet is Cat or Dog
        for(Pet pet : pets){
            pet.makeSound();
        }

    }
}

Output:
Meow
Woof

We can achieve polymorphism in two ways:
1-overloading
2-overriding.

--------------------------------------------------------------------------------------------------------
Q- What is overloading?
Ans:-If you want to perform one operation with methods having the same name, that is called overloading;
   in other words, the method name should be the same but the signature or parameters should be different.

Overloading is an example of compile-time polymorphism.

Advantage of method overloading
Method overloading increases the readability of the program.

example:-
public class Sum {

    // Overloaded sum(). This sum takes two int parameters
    public int sum(int x, int y)
    {
        return (x + y);
    }

    // Overloaded sum(). This sum takes three int parameters
    public int sum(int x, int y, int z)
    {
        return (x + y + z);
    }

    // Overloaded sum(). This sum takes two double parameters
    public double sum(double x, double y)
    {
        return (x + y);
    }

    // Driver code
    public static void main(String args[])
    {
        Sum s = new Sum();
        System.out.println(s.sum(10, 20));
        System.out.println(s.sum(10, 20, 30));
        System.out.println(s.sum(10.5, 20.5));
    }
}
---------------------------------------------------
Q) Why is method overloading not possible by changing only the return type of the method?
In Java, method overloading is not possible by changing only the return type of the method, because of ambiguity. Let's see how the ambiguity may occur:

class Adder{
static int add(int a,int b){return a+b;}
static double add(int a,int b){return a+b;}
}
class TestOverloading3{
public static void main(String[] args){
System.out.println(Adder.add(11,11));//ambiguity
}}
---------------------------------------------------
Can we overload the Java main() method?
Yes, by method overloading. You can have any number of main methods in a class through method overloading, but the JVM only calls the main() method which receives a String array as its argument. Let's see a simple example:

class TestOverloading4{
public static void main(String[] args){System.out.println("main with String[]");}
public static void main(String args){System.out.println("main with String");}
public static void main(){System.out.println("main without args");}
}
------------------------------------------------------------

Q:- What is overriding?
Ans:-Declaring a method in a subclass which is already present in the superclass is called overriding.

Let's take a simple example to understand this. We have two classes:
a child class B and a parent class A. The B class extends the A class.
Both classes have a common method void eat().
B class gives its own implementation of the eat() method,
or in other words it overrides the eat() method.

The purpose of method overriding is clear here:
the child class wants to give its own implementation so that when it calls this method, it prints "B is eating" instead of "A is eating".


The main advantage of method overriding is that a class can give its own specific implementation
of an inherited method without even modifying the parent class code.


Method overriding is an example of runtime polymorphism:
when a parent class reference points to a child class object, the call to the overridden method is determined at runtime.



class ABC{
   //Overridden method
   public void disp()
   {
System.out.println("disp() method of parent class");
   }  
}
class Demo extends ABC{
   //Overriding method
   public void disp(){
System.out.println("disp() method of Child class");
   }
   public void newMethod(){
System.out.println("new method of child class");
   }
   public static void main( String args[]) {
/* When Parent class reference refers to the parent class object
* then in this case overridden method (the method of parent class)
*  is called.
*/
ABC obj = new ABC();
obj.disp();

/* When parent class reference refers to the child class object
* then the overriding method (method of child class) is called.
* This is called dynamic method dispatch and runtime polymorphism
*/
ABC obj2 = new Demo();
obj2.disp();
   }
}
Output:

disp() method of parent class
disp() method of Child class


Rules for method overriding:
1-In Java, a method can only be overridden in a subclass, not in the same class.
2-The argument list should be exactly the same as that of the overridden method.
3-The return type should be the same or a subtype of the return type declared
in the original overridden method in the super class.

4-The access level cannot be more restrictive than the overridden method's access level.
5-For example: if the super class method is declared public then the overriding
method in the sub class cannot be either private or protected.

6-Instance methods can be overridden only if they are inherited by the subclass.
7-A method declared final cannot be overridden.
8-A method declared static cannot be overridden but can be re-declared.
9-If a method cannot be inherited then it cannot be overridden.
10-A subclass within the same package as the instance’s
superclass can override any superclass method that is not declared private or final.

11-A subclass in a different package can only override the non-final methods declared public or protected.
12-An overriding method can throw any unchecked exceptions, regardless of whether
the overridden method throws exceptions or not.

13-However the overriding method should not throw checked exceptions that are new or broader
than the ones declared by the overridden method. The overriding method can throw narrower or
fewer exceptions than the overridden method.

14-Constructors cannot be overridden.
-------------------------------------------------------------------------------------------------------

Q:-Overriding vs Overloading:
Ans:-
1-Overloading is about methods with the same name having different signatures. Overriding is about the same method,
  same signature, but in different classes connected through inheritance.

2-Overloading is an example of compile-time polymorphism and overriding is an
  example of runtime polymorphism.

-------------------------------------------------------------------------------------------------------
Q:- What is Inheritance?
Ans:-It is the mechanism by which one class is allowed to inherit the features (fields and methods) of another class.

-------------------------------------------------------------------------------------------------------
Q:-Why use Inheritance?
For method overriding (used for runtime polymorphism).
Its main uses are to enable polymorphism and to be able to reuse code across different classes
         by putting it in a common super class.
For code re-usability.


Advantages of inheritance
If we develop an application using the concept of inheritance, that application has the following advantages:
Application development time is less.
The application takes less memory.
Application execution time is less.
Application performance is enhanced (improved).
Redundancy (repetition) of the code is reduced or minimized, so
we get consistent results and less storage cost.
-------------------------------------------------------------------------------------------------------
Q:-Can we overload a static method in Java? (answer)

Ans-Yes, you can overload a static method in Java. You can declare as many static methods of the same name as you like, as long as their parameters differ.
------------------------------------------------------------------------------------------------------
Q:-Can we override a static method in Java? (answer)
Ans-No, you cannot override a static method because it is not bound to an object. Instead,
static methods belong to a class and are resolved at compile time using the type of the reference variable.

Can a class extend more than one class in Java?
No, a class can only extend one other class, because Java doesn't support multiple inheritance of classes; but yes,
 it can implement multiple interfaces.
-------------------------------------------------------------------------------------------------
Q:-Can we make a class both final and abstract at the same time? (answer)

ans-No, they are exact opposites of each other. A final class in Java cannot be extended,
while you cannot use an abstract class without extending it and making it a concrete class.

-------------------------------------------------------------------------------------------------
Q:Can an interface extend more than one interface in Java?
ans-Yes, an interface can extend more than one interface in Java; it's perfectly valid.

==================================================================================================
Q:- How many types of Exception are there?
Ans;-
1) Checked Exception
2) Unchecked Exception

1:-Checked Exception
Checked Exception - "if the code inside a method or constructor throws some exception and
that exception is checked by the compiler, the compiler forces you to report the exception
by writing a try and catch block or by using the throws keyword; this is called a checked exception."
These are the exceptions which are checked at compile time.
These exceptions are direct sub-classes of the java.lang.Exception class.
Only to remember: checked means checked by the compiler, so checked exceptions are checked at compile time.

Example:- ClassNotFoundException, FileNotFoundException

2:-Unchecked Exception
Unchecked Exception - this is also called a runtime exception;
if some code inside a method throws an exception and the compiler does not tell you anything
about whether you want to report the exception or not, it is called an
unchecked exception. These are the exceptions which are identified or raised at run time.
These exceptions are direct sub-classes of the java.lang.RuntimeException class.

Note: In real-time applications we mostly handle unchecked exceptions.
Only to remember: unchecked means not checked by the compiler, so unchecked exceptions are checked at run time,
not compile time.

Example:- NullPointerException, ArrayIndexOutOfBoundsException (see the sketch below)
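A small sketch (the file name is illustrative) showing the difference in practice: the checked FileNotFoundException must be handled or declared, while an unchecked exception such as ArithmeticException compiles without any handling.

import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;

public class ExceptionDemo {
    public static void main(String[] args) {
        try {
            // checked: the compiler forces a try/catch block or a throws declaration
            FileReader reader = new FileReader("data.txt");
            reader.close();
        } catch (FileNotFoundException e) {
            System.out.println("File not found: " + e.getMessage());
        } catch (IOException e) {
            System.out.println("I/O error: " + e.getMessage());
        }

        // unchecked: nothing forces us to handle a possible ArithmeticException here
        int result = 10 / 2;
        System.out.println(result);
    }
}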

==========================================================================================================

Q:-Difference between checked Exception and unchecked Exception
Checked Exception                                         Unchecked Exception
1- Checked exceptions are checked at compile time         Unchecked exceptions are checked at run time


OOPS Interview Question and Answer
==========================================================================================================
Q:-How to create a user-defined Exception or Custom Exception in Java?
Ans:-

An exception that is designed by the user is known as a user-defined or custom exception.
A custom exception is created by the user.
Rules to design a user-defined Exception (a sketch follows the list):
1. Create a package with a valid user-defined name.
2. Create any user-defined class.
3. Make that user-defined class a derived class of the Exception or RuntimeException class.
4. Declare a parameterized constructor with a String variable.
5. Call the super class constructor by passing the String variable within the derived class constructor.
6. Save the program as PublicClassName.java.
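A minimal sketch following the rules above (the package name and the NumberNotFoundException class are illustrative):

package com.example.exceptions;

public class NumberNotFoundException extends RuntimeException {

    // parameterized constructor with a String message (rule 4)
    public NumberNotFoundException(String message) {
        super(message); // call the super class constructor with the message (rule 5)
    }
}

It can then be thrown like any predefined exception: throw new NumberNotFoundException("number not found");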


e.g. predefined exceptions: FileNotFoundException, ArithmeticException, NullPointerException, ArrayIndexOutOfBoundsException etc.; user-defined exception: NumberNotFoundException etc.



Top 20 Spring Boot Interview Questions and Answers

Q:What is Spring Boot?
ans:Spring Boot is a Java framework that is best suited for building microservices.

Q:-Spring Boot vs Spring MVC vs Spring - how do they compare?
Ans:
The most important feature of the Spring framework is
dependency injection. At the core of all Spring modules is dependency
injection, or IoC (inversion of control).

So Spring is about dependency injection;
it makes the code loosely coupled.

Spring MVC:-
The Spring MVC framework provides a decoupled way of developing web applications; with
simple concepts like DispatcherServlet, ModelAndView and ViewResolver,
it makes it easy to develop web applications.



Spring Boot:- The problem with the Spring framework and Spring MVC is
the amount of configuration that is needed.

With Spring and Spring MVC you need to configure a data source, a view resolver, webjars and
a lot of other stuff.



Spring Boot says: OK, I'll look at what jars are available on the classpath
and I will configure everything immediately;
that's what Spring Boot enables.

Spring Boot solves this problem through a combination of auto-configuration
and starter projects. Spring Boot also provides a few non-functional features
which make monitoring the application very, very easy.

Q:What is Auto Configuration?
Ans:-Auto-configuration is a feature of Spring Boot where it looks at the frameworks
on the CLASSPATH and at the existing configuration for the application.

It decides what configuration is needed and what can be automatically configured.
Spring Boot provides the basic configuration needed to configure the application with these frameworks; this is called auto-configuration.


The goal is to make building production-ready applications faster.


Example:- if it sees the JPA jar on the CLASSPATH, it automatically configures an entity manager
and automatically configures data sources.

2nd:- if it sees the Spring MVC jar on the CLASSPATH,
it automatically configures the DispatcherServlet
and automatically configures error pages and error responses.

Q:- What are Spring Boot Starter Projects?
Ans:-Whenever you develop a project with Spring, Spring MVC or Hibernate,
you need to add a lot of dependencies, manage their versions, and configure a lot of other stuff.

"So the spring-boot-starter-web project helps you avoid all that."

If you add a single starter dependency in the pom.xml of the application,
like...

<dependency>
     <groupId>org.springframework.boot</groupId>
     <artifactId>spring-boot-starter-web</artifactId>
</dependency>

it will bring in a lot of jars along with all the Spring-related jars.

Q: What are the other starter project options that Spring Boot provides?

Ans: Spring Boot provides a variety of starters, for example:

spring-boot-starter-web-services -- SOAP web services
spring-boot-starter-web -- Web & RESTful applications
spring-boot-starter-data-jpa -- Spring Data JPA with Hibernate
spring-boot-starter-test -- Unit testing and integration testing
spring-boot-starter-data-rest -- Expose simple REST services using Spring Data REST

spring-boot-starter-jdbc -- Traditional JDBC
spring-boot-starter-security -- Authentication and authorization using Spring Security


Q:- How does Spring Boot enable creating production-ready applications in quick time?

Ans:-Spring Boot aims to enable production-ready applications in quick time.
Spring Boot provides a few non-functional features out of the box,
like caching, logging, monitoring and an embedded server.

spring-boot-starter-actuator - to use advanced features like monitoring & tracing for your application out of the box.
spring-boot-starter-logging
spring-boot-starter-cache - enables the Spring Framework's caching support.

Q:- What is the minimum baseline Java version for Spring Boot 2 and Spring 5?
Ans: Spring Boot 2.0 requires Java 8;
Java 7 and earlier are no longer supported.


Q:- What is the easiest approach to create a Spring Boot project?
Ans:-
Spring Initializr: go to http://start.spring.io/,
choose the dependencies, enter the groupId and artifactId,
and click Generate Project.

Q: Is Spring Initializr the only way to create a Spring Boot project?
Ans:-No.
Spring Initializr makes it easy to create a Spring Boot project,
but you can also set up a Maven project and add the right dependencies to start off.

We have two ways to create a Spring Boot project:
the first one is start.spring.io;
the other one - setting up the project manually - is used in the section titled
"Basic web application",
like... Go to Eclipse, use File -> New Maven Project to create a new project,
        add the dependencies,
        add the Maven plugin,
        and add the Spring Boot application class.

Q: Why do we need the spring-boot-maven-plugin?
Ans:-spring-boot-maven-plugin provides a few goals which enable packaging
the code as a jar and running the application:

spring-boot:run runs your Spring Boot application.
spring-boot:repackage repackages your jar/war to be executable.
spring-boot:start and spring-boot:stop manage the lifecycle of your Spring
Boot application (i.e. for integration tests).

spring-boot:build-info generates build information that can be used by the Actuator.

Q: How can I enable auto reload of my application with Spring Boot?
Ans:-By using the Spring Boot Developer Tools dependency:
just add the spring-boot-devtools dependency in the pom.xml and restart the application.

Q:-What are embedded servers, and why use them?
Ans:-Think about what you would need to do to deploy your application
on a virtual machine:

step 1:- install Java
step 2:- install the web/application server (Tomcat/WebLogic)
step 3:- deploy the application war


An embedded server means our deployable unit already contains the binaries for the server (for example tomcat.jar).


Q:- How can I add custom JS code with Spring Boot?
Ans:- Create a folder called static under the resources folder;
you can put your static content in that folder.
path: resources/static/js/app.js

<script src="/js/app.js"></script>


Q: What is Spring Data REST?
Ans:-Spring Data REST can be used to expose HATEOAS RESTful resources
around Spring Data repositories.

@RepositoryRestResource(collectionResourceRel = "todos", path = "todos")
public interface TodoRepository extends PagingAndSortingRepository<Todo, Long> {
}

Without writing a lot of code we can expose RESTful APIs around the data
repositories.
========================================================================

Q:- What is Spring Boot?

Spring Boot is a Spring framework module
 which provides RAD (Rapid Application Development) features to the Spring framework.
It is highly dependent on the starter templates feature, which is very powerful and works flawlessly.

1. What is a starter template?
Spring Boot starters are templates that contain a collection of all the relevant transitive dependencies.

For example, if you want to create a Spring Web MVC application then in a traditional setup
you would have included all the required dependencies yourself.

With Spring Boot, to create an MVC application all you need to import is the spring-boot-starter-web dependency.


2. Spring Boot auto-configuration
Auto-configuration is enabled with the @EnableAutoConfiguration annotation.
 Spring Boot auto-configuration scans the classpath and finds the libraries on the classpath.

Spring Boot auto-configuration logic is implemented in spring-boot-autoconfigure.jar.



For example, look at the auto-configuration for Spring AOP. It does the following:

It scans the classpath to see if the EnableAspectJAutoProxy, Aspect, Advice and AnnotatedElement classes are present.
If the classes are not present, no auto-configuration is applied for Spring AOP.
If the classes are found, then AOP is configured with the Java config annotation @EnableAspectJAutoProxy.
It checks for the property spring.aop, whose value can be true or false.
Based on the value of the property, the proxyTargetClass attribute is set.
AopAutoConfiguration.java
@Configuration
@ConditionalOnClass({ EnableAspectJAutoProxy.class, Aspect.class, Advice.class,
        AnnotatedElement.class })
@ConditionalOnProperty(prefix = "spring.aop", name = "auto", havingValue = "true", matchIfMissing = true)
public class AopAutoConfiguration
{

    @Configuration
    @EnableAspectJAutoProxy(proxyTargetClass = false)
    @ConditionalOnProperty(prefix = "spring.aop", name = "proxy-target-class", havingValue = "false", matchIfMissing = false)
    public static class JdkDynamicAutoProxyConfiguration {

    }

    @Configuration
    @EnableAspectJAutoProxy(proxyTargetClass = true)
    @ConditionalOnProperty(prefix = "spring.aop", name = "proxy-target-class", havingValue = "true", matchIfMissing = true)
    public static class CglibAutoProxyConfiguration {

    }

}



3. Embedded server
Spring Boot applications include Tomcat as the embedded server dependency by default.
It means you can run Spring Boot applications from the command prompt without needing complex server infrastructure.

You can exclude Tomcat and include any other embedded server if you want.

For example, the configuration below excludes Tomcat and includes Jetty as the embedded server.

pom.xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-tomcat</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jetty</artifactId>
</dependency>


Q:- Which annotation is required to run the application?
Ans:-
To run the application, we need to use the @SpringBootApplication annotation. Behind the scenes,
that’s equivalent to @Configuration, @EnableAutoConfiguration, and @ComponentScan together.

MyApplication.java
@SpringBootApplication
public class MyApplication
{
    public static void main(String[] args)
    {
        SpringApplication.run(MyApplication.class, args);
    }
}

------------------------------------------------
To execute the application, you can run the main() method from an IDE such as Eclipse, or you can build the jar file and execute it from the command prompt.

Console
$ java -jar spring-boot-demo.jar
------------------------------------------------

5. Advantages of Spring Boot
Ans:
1-Spring Boot helps in resolving dependency conflicts. It identifies the required dependencies and imports them for you.
2-It has information about the compatible versions of all dependencies. It minimizes runtime classloader issues.


3-It helps in avoiding boilerplate code, annotations and XML configuration.
4-It provides an embedded HTTP server (Tomcat) so that you can develop and test quickly.
5-It has excellent integration with IDEs like Eclipse and IntelliJ IDEA.



The @SpringBootApplication annotation is a shortcut for applying 3 annotations in one statement –

@SpringBootConfiguration
@SpringBootConfiguration is a new annotation in Spring Boot 2. Previously, we had been using the @Configuration annotation. You can use @Configuration in place of it;
both behave the same way.


@EnableAutoConfiguration
This annotation is used to enable auto-configuration of the Spring application context, attempting to guess and configure the beans that you are likely to need. Auto-configuration classes are usually applied based on your classpath and on what beans you have defined.

@EnableAutoConfiguration(excludeName = {"multipartResolver","mbeanServer"})
Auto-configuration is always applied after user-defined beans have been registered.

@ComponentScan
This annotation provides support parallel to Spring XML's <context:component-scan> element.

Either basePackageClasses() or basePackages() may be specified to define specific packages to scan. If specific packages are not defined, scanning will occur from the package of the class that declares this annotation, as in the sketch below.
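A short sketch (the package name is illustrative) of restricting the component scan to a specific package:

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

@Configuration
@ComponentScan(basePackages = "com.example.myapp")
public class AppConfig {
}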


spring-boot-starter-parent Example

The spring-boot-starter-parent dependency is used internally by all Spring Boot dependencies.

What is the spring-boot-starter-parent dependency?
The spring-boot-starter-parent dependency is the parent POM providing dependency and plugin management for Spring Boot-based applications. It contains the default version of Java to use, the default versions of the dependencies that Spring Boot uses, and the default configuration of the Maven plugins.



Top 20 Spring REST Web Services Interview Questions and Answers

What does REST stand for?
(answer)
REST stands for REpresentational State Transfer,
which uses the HTTP protocol to send data from client to server,
e.g. a book on the server can be delivered to the client using JSON or XML.


What is a resource?
(answer)
A resource is how data is represented in REST architecture.
By exposing entities as the resource it allows a client to read, write, modify,
and create resources using HTTP methods e.g. GET, POST, PUT, DELETE etc.


What are safe REST operations?
(answer)
REST APIs use HTTP methods to perform operations.
The HTTP operations which don't modify the resource on the server are known as safe operations,
e.g. GET and HEAD. On the other hand, PUT, POST, and DELETE are unsafe because they modify the resource on the server.


What are idempotent operations? Why is idempotency important?
(answer)
There are some HTTP methods, e.g. GET, which produce the same response no matter how many times you use them,
 e.g. sending multiple GET requests to the same URI will result in the same response without any side effect,
hence it is known as idempotent.

On the other hand, POST is not idempotent, because if you send multiple POST requests
 it will result in multiple resources being created on the server; but PUT is idempotent
if you are using it to update a resource:
even multiple PUT requests to update a resource on a server will give the same end result.
 You can further take the HTTP Fundamentals course by Pluralsight
to learn more about the idempotent methods of HTTP and HTTP in general.


Is REST scalable and/or interoperable?
(answer)
Yes, REST is Scalable and interoperable. It doesn't mandate a specific choice of technology
either at client or server end. You can use Java, C++, Python or JavaScript
to create RESTful Web Services and Consume them at the client end.
 I suggest you read a good book on REST API e.g. RESTful Web Services to learn more about REST.


What are the advantages of the RestTemplate?
(answer)
The RestTemplate class is an implementation of the Template Method pattern in the Spring framework.
Similar to other popular template classes, e.g. JdbcTemplate or JmsTemplate,
it also simplifies the interaction with RESTful web services on the client side.
You can use it to consume a RESTful web service very easily, as shown in the sketch below.
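A hedged sketch of consuming a RESTful web service with RestTemplate; the URL is the illustrative one used later in this post, and the response body is read as a plain String:

import org.springframework.web.client.RestTemplate;

public class BookClient {
    public static void main(String[] args) {
        RestTemplate restTemplate = new RestTemplate();
        // getForObject() performs a GET request and converts the response body to the given type
        String response = restTemplate.getForObject("http://myapp.com/books/101", String.class);
        System.out.println(response);
    }
}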


Which HTTP methods does REST use?
(answer)
REST can use any HTTP method, but the most popular ones are GET for retrieving a resource,
POST for creating a resource, PUT for updating a resource and DELETE for removing a resource from the server.



What is an HttpMessageConverter in Spring REST?
(answer)
An HttpMessageConverter is a Strategy interface
that specifies a converter that can convert from and to HTTP requests and responses.
Spring REST uses this interface to convert HTTP response to various formats e.g. JSON or XML.

Each HttpMessageConverter implementation has one or several MIME types associated with it.
Spring uses the "Accept" header to determine the content type the client is expecting.

It will then try to find a registered HttpMessageConverter
that is capable of handling that specific content type and
use it to convert the response into that format before sending it to the client.



How to create a custom implementation of HttpMessageConverter
to support a new type of request/responses?
(answer)
You just need to create an implementation of AbstractHttpMessageConverter and register
it using the WebMvcConfigurerAdapter#extendMessageConverters() method with the classes
which generate a new type of request/response.


Is REST normally stateless?
 (answer)
Yes, a REST API should be stateless because it is based on HTTP, which is also stateless.
A request in a REST API should contain all the details required to process it, i.e.
 it should not rely on previous or next requests or on some data maintained at the server end, e.g.
sessions. The REST specification puts a constraint to make it stateless and
 you should keep that in mind while designing your REST API.


What does @RequestMapping annotation do?
(answer)
The @RequestMapping annotation is used to map web requests to Spring controller methods.
 You can map requests based upon HTTP methods, e.g. GET and POST, and various other parameters.
For example, if you are developing a RESTful web service using Spring then you can use the produces
and consumes properties along with the media type to indicate that
this method is only used to produce or consume JSON, as shown below:


@RequestMapping (method = RequestMethod.POST, consumes="application/json")
public Book save(@RequestBody Book aBook) {
   return bookRepository.save(aBook);
}
You can similarly create other handler methods to produce JSON or XML.
If you are not familiar with these annotations then
I suggest you join Spring MVC For Beginners course on Udemy to learn from scratch.



Is @Controller a stereotype? Is @RestController a stereotype?
(answer)
Yes, both @Controller and @RestController are stereotypes.
The @Controller is actually a specialization of Spring's @Component stereotype annotation.
This means that a class annotated with @Controller will also automatically be detected by the Spring container
as part of the container's component scanning process.

And @RestController is a specialization of @Controller for RESTful web services.
It not only combines the @ResponseBody and @Controller annotations but also gives more meaning
to your controller class, clearly indicating that it deals with RESTful requests.

The Spring Framework may also use this annotation to provide some more useful features related
to REST API development in the future.


What is the difference between @Controller and @RestController?
 (answer)
There are many differences between @Controller and @RestController,
 as discussed in my earlier article (see the answer),
but the most important one is that with @RestController you get the @ResponseBody annotation automatically,
which means you don't need to separately annotate your handler methods with the @ResponseBody annotation.
 This makes the development of RESTful web services easier using Spring. You can see here to learn more.





When do you need @ResponseBody annotation in Spring MVC?
(answer)
The @ResponseBody annotation can be put on a method to indicate
that the return type should be written directly to the HTTP response body (and not placed in a Model,
 or interpreted as a view name).

For example:

@RequestMapping(path = "/hello", method = RequestMethod.PUT)
@ResponseBody
public String helloWorld() {
   return "Hello World";
}

Alternatively, you can also use the @RestController annotation instead of the @Controller annotation.
 This will remove the need for using @ResponseBody because, as discussed in the previous answer,
it comes automatically with the @RestController annotation.



What does @PathVariable do in Spring MVC? Why it's useful in REST with Spring?
(answer)
It's one of the useful annotations from Spring MVC which allows you
to read values from the URI, like a query parameter.
It's particularly useful when creating a RESTful web service using Spring,
because in REST, resource identifiers are part of the URI.
This question is normally asked of experienced Spring MVC developers, e.g. 4 to 6 years of experience.

For example, in the URL http://myapp.com/books/101,
if you want to extract 101, the id, then you can use the @PathVariable annotation of Spring MVC, as in the sketch below.
If you are not familiar with Spring MVC annotations then Spring MVC For Beginners:
Build Java Web App in 25 Steps is a good place to start.
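A hedged sketch (the BookController class and its return value are illustrative) of extracting the id with @PathVariable:

import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class BookController {

    @RequestMapping(path = "/books/{id}", method = RequestMethod.GET)
    public String getBook(@PathVariable("id") long id) {
        // for http://myapp.com/books/101 the id parameter will be 101
        return "Book with id " + id;
    }
}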





What is the HTTP status return code for a successful DELETE statement?
(answer)
There is no strict rule with respect to what status code your REST API should return after a successful DELETE,
i.e. it can return 200 OK or 204 No Content. In general,
 if the DELETE operation is successful and the response body is empty, return 204.
 If the DELETE request is successful and the response body is NOT empty, return 200.


What does CRUD mean?
(answer)
CRUD is a short form of Create, Read, Update and Delete.
In a REST API, POST is used to create a resource,
 GET is used to read a resource,
PUT is used to update a resource and DELETE is used to remove a resource from the server.
This one is another beginner-level Spring MVC question for programmers with 1 to 3 years of experience.


Where do you need @EnableWebMvc?
 (answer)
The @EnableWebMvc annotation is required to enable Spring MVC
when Java configuration is used to configure Spring MVC instead of XML.
It is equivalent to <mvc:annotation-driven /> in XML configuration.

It enables support for @Controller-annotated classes that use @RequestMapping
to map incoming requests to handler methods. If you are not already familiar with
Spring's support for Java configuration, the Spring Master Class on Udemy is a good place to start.


Q:When do you need the @ResponseStatus annotation in Spring MVC?
(answer)
A good question for Spring developers with 3 to 5 years of experience.
 The @ResponseStatus annotation is required during error handling in Spring MVC and REST.
Normally, when an error or exception is thrown on the server side,
the web server returns a blanket HTTP status code 500 - Internal Server Error.

This may work for a human user but not for REST clients. You need to send them a proper status code,
e.g. 404 if the resource is not found. That's where
you can use the @ResponseStatus annotation, which allows
you to send a custom HTTP status code along with a proper error message in case of an exception.

In order to use it, you can create custom exceptions and annotate them with the
@ResponseStatus annotation and the proper HTTP status code and reason.

When such exceptions are thrown from a controller's handler methods and not handled anywhere else,
the appropriate HTTP response with the proper HTTP status code which you have set is sent to the client.

For example, if you are writing a RESTful web service for a library
which provides book information, then you can use @ResponseStatus to create an exception
 which returns the HTTP response code 404 when a book is not found, instead of Internal Server Error (500),
as shown below:

 @ResponseStatus(value=HttpStatus.NOT_FOUND, reason="No such Book")  // 404
 public class BookNotFoundException extends RuntimeException {
     // ...
 }

If this Exception is thrown from any handler method then
HTTP error code 404 with reason "No such Book" will be returned to the client.


Is REST secure? What can you do to secure it?
(answer)
This question is mostly asked of experienced Java programmers,
e.g. 2 to 5 years of experience with both REST and Spring. Security is a broad term;
it could mean security of the message, which is provided by encryption, or access restriction,
 which is provided using authentication and authorization. REST is normally not secure by itself,
but you can secure it by using Spring Security.

At the very least you can enable HTTP basic authentication in your
Spring Security configuration. Similarly, you can expose your REST API over HTTPS
if the underlying server supports HTTPS.






Does REST work with transport layer security (TLS)?
(answer)
TLS or Transport Layer Security is used for secure communication between client and server.
 It is the successor of SSL (Secure Sockets Layer). Since HTTPS can work with both SSL and TLS,
REST can also work with TLS.

Actually, REST doesn't say anything about security; it's up to the server which implements it.
 The same RESTful web service can be accessed using HTTP and HTTPS if the server supports SSL.

If you are using Tomcat, you can see here to learn more about how to enable SSL in Tomcat.


Do you need Spring MVC in your classpath for developing RESTful Web Services? (answer)
This question is often asked of Java programmers with 1 to 2 years of experience in Spring. The short answer is yes: you need Spring MVC in your Java application's classpath to develop RESTful web services using the Spring framework, because it's actually Spring MVC which provides all the useful annotations, e.g. @RestController, @ResponseStatus, @ResponseBody and @RequestBody.

Java Top 100 Interview Questions and Answers

Q:- Wrapper classes in Java?
Ans:-Wrapper classes are used for converting primitive data types into objects,
like int to Integer.
There are mainly two uses of wrapper classes:

1) To convert simple data types into objects
2) To convert strings into data types


The eight primitive data types are byte, short, int, long, float, double, char and boolean.

Why do we need wrapper classes?
For example: while working with collections in Java,
we use generics for type safety, like this: ArrayList<Integer> instead of ArrayList<int>.
Integer is the wrapper class of the int primitive type.
We use the wrapper class in this case because generics need objects, not primitives.


Wrapper class Example: Primitive to Wrapper

Note: Wrapper class objects allow null values while a primitive data type doesn't.

Example 1:-
class WrapperClass {
    public static void main(String[] args) {

        int i = 5;                    // primitive data type
        Integer li = new Integer(i);  // putting the value inside an object is called boxing

        int j = li.intValue();        // unboxing

        System.out.println(i + " " + li + " " + j);
    }
}

---------------------------------------------------------------------------------

Q: StringBuffer and StringBuilder?
Ans: StringBuffer is synchronized and thread-safe, but StringBuilder is not synchronized and not thread-safe.
StringBuffer and StringBuilder are both mutable.

Example:

String str = new String("Hello");
StringBuffer buffer = new StringBuffer("Hello");
StringBuilder builder = new StringBuilder("Hello");

str.concat("Hello");
buffer.append("Hi");
builder.append("Hi");

System.out.println(str);
System.out.println(buffer);
System.out.println(builder);

OUTPUT:-
Hello    -> a String is immutable, so it is not changed; it still holds the original string data
HelloHi  -> StringBuffer is mutable, i.e. changeable, so the appended output is printed
HelloHi  -> StringBuilder is also mutable, so the appended output is printed

-------------------------------------------------------------------
Q: diff btn == and equal()?

== method is cpmpare reference
equals() method its compare content of the object

example:

String str1= new String("Hello");
String str2 = new String("Hello");

if(str1 == str2){ // comparing references
  System.out.println("str1 == str2");
}else{
  System.out.println("str1 != str2");
}

if(str1.equals(str2)){ // comparing content
  System.out.println("str1 is equal to str2");
}else{
  System.out.println("str1 is not equal to str2");
}

OUTPUT:-
str1 != str2
str1 is equal to str2
----------------------------------------
Q: What is the difference between an abstract class and an interface?

Abstract class vs Interface:
1) An abstract class can provide complete default code and/or just the details that have to be overridden,
whereas an interface cannot provide any code at all, just the method signatures (prior to Java 8 default methods).

2) A class may extend only one abstract class,
whereas a class may implement several interfaces.

3) An abstract class can have non-abstract methods,
whereas in an interface all methods are abstract (again, before Java 8).

4) An abstract class can have instance variables,
whereas an interface cannot have instance variables (only public static final constants).

5) Abstract class members can have any visibility: public, private, protected,
whereas interface members must be public (or default to public).

6) An abstract class can contain constructors,
whereas an interface cannot contain constructors.

7) Abstract classes are slightly faster,
whereas an interface is slower as it requires extra indirection to find the corresponding method in the actual class.

A minimal sketch contrasting the two is shown below.
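For illustration, a minimal sketch (Shape, Drawable and Circle are made-up names):

// interface: only the contract, no state (pre-Java 8 style)
interface Drawable {
    void draw();
}

// abstract class: can hold state, constructors and default behaviour
abstract class Shape {
    protected String name;              // instance variable - not allowed in an interface

    Shape(String name) {                // constructor - not allowed in an interface
        this.name = name;
    }

    abstract double area();             // abstract method: subclasses must override it

    void describe() {                   // concrete method with default code
        System.out.println(name + " has area " + area());
    }
}

class Circle extends Shape implements Drawable {
    private final double r;

    Circle(double r) { super("circle"); this.r = r; }

    @Override double area() { return Math.PI * r * r; }

    @Override public void draw() { System.out.println("drawing a " + name); }
}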


------------------------------------------------------------------------------
Q: What is overriding? Give an example.
Ans: Declaring a method in a subclass that is already present in the superclass is called overriding.

The method name, parameter signature and return type should all be the same.

-------------------------------------------------------------------------------
Example: What will be the output of this program?

class Parent
{
  public static void foo(){
  System.out.println("i am foo in parent");
 }


 public  void bar(){
  System.out.println("i am bar in parent");
 }

}

class Child extends Parent{

   public static void foo(){
    System.out.println("i am foo in child");
   }

public  void bar(){
  System.out.println("i am bar in child");
 }

public static void main(String args[])
{

Parent par = new Child();
Child child = new Child();

par.foo();
child.foo();

par.bar();
child.bar();

}

}


Output:

i am foo in parent
i am foo in child
i am bar in child
i am bar in child

(Static methods are not overridden but hidden, so par.foo() is resolved at compile time against the Parent reference type.
The instance method bar() is overridden, so both calls run the Child version at runtime.)

------------------------------------
Q: What is the servlet lifecycle?
ans:- There are five stages (see the sketch after this list):
> servlet class is loaded
> servlet is instantiated
> servlet is initialized (init())
> servlet services client requests (service())
> servlet is destroyed (destroy())
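A minimal sketch of the lifecycle methods a servlet can override (LifeCycleServlet is a made-up name):

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class LifeCycleServlet extends HttpServlet {

    @Override
    public void init() throws ServletException {
        // called once, after the container loads and instantiates the servlet
        System.out.println("servlet initialized");
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // service() dispatches each GET request here; called once per request
        resp.getWriter().println("hello from the servlet");
    }

    @Override
    public void destroy() {
        // called once, just before the container unloads the servlet
        System.out.println("servlet destroyed");
    }
}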
-------------------------------------------------------------------------------------------
Q:- What is RequestDispatcher?
The RequestDispatcher interface is used to forward the request to another resource;
that can be an HTML page, a JSP or another servlet in the same application.

We can also use it to include the content of another resource in the response.
There are two methods defined in this interface (see the sketch below):
1) forward()
2) include()
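A small sketch of forwarding from a servlet (ForwardServlet and /result.jsp are made-up names):

import java.io.IOException;
import javax.servlet.RequestDispatcher;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ForwardServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // forward(): control and the response are handed over to the target resource
        RequestDispatcher rd = req.getRequestDispatcher("/result.jsp");
        rd.forward(req, resp);

        // rd.include(req, resp);  // include() would merge the target's output into this response instead
    }
}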
----------------------------------------------------------------------------------------
Q:- What is a bean in Spring, and what are the scopes of a bean in Spring?
A bean is an object that is instantiated, assembled and managed by the Spring IoC container.
There are 5 scopes of a bean in Spring (a configuration sketch follows this list):
singleton
prototype
request
session
global-session
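A minimal bean-definition sketch showing the scope attribute (com.java.Employee is reused from the autowiring example further below; when scope is omitted it defaults to singleton):

<bean id="empSingleton" class="com.java.Employee"/>                    <!-- default scope: singleton -->
<bean id="empPrototype" class="com.java.Employee" scope="prototype"/>  <!-- new instance for every lookup -->
<bean id="empRequest"   class="com.java.Employee" scope="request"/>    <!-- web-aware scope: one per HTTP request -->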
-----------------------------------------------------------------------------------------
Q:- What are DispatcherServlet and ContextLoaderListener?
DispatcherServlet:-
DispatcherServlet is the front controller in a Spring MVC
application: it loads the Spring bean configuration file
and initializes all the beans that have been configured.

ContextLoaderListener:
ContextLoaderListener is the listener used to start up and shut down Spring's root WebApplicationContext (a web.xml sketch follows).
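A hedged web.xml sketch wiring both up (the servlet name "spring" and the file root-context.xml are made-up examples):

<!-- root application context, loaded by ContextLoaderListener -->
<context-param>
    <param-name>contextConfigLocation</param-name>
    <param-value>/WEB-INF/root-context.xml</param-value>
</context-param>
<listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>

<!-- front controller; by default a servlet named "spring" loads /WEB-INF/spring-servlet.xml -->
<servlet>
    <servlet-name>spring</servlet-name>
    <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>spring</servlet-name>
    <url-pattern>/</url-pattern>
</servlet-mapping>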
------------------------------------------------------------------------------------------
Q: What is the difference between constructor injection and setter injection?

Ans:-
Constructor injection:
1- No partial injection (all constructor arguments must be supplied).
2- Does not override the setter property.
3- Creates a new instance if any modification occurs.
4- Better when there are many mandatory properties.

Setter injection:
1- Partial injection is possible.
2- Overrides the constructor property if both are defined.
3- Does not create a new instance if you change a property value.
4- Better for few (optional) properties.

A bean-configuration sketch of both styles follows.
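A minimal sketch of both styles in the bean configuration file (it assumes com.java.Employee has a matching constructor and a setName() setter):

<!-- constructor injection: the value is passed through a constructor argument -->
<bean id="empByConstructor" class="com.java.Employee">
    <constructor-arg value="Ravi"/>
</bean>

<!-- setter injection: the value is passed through the setName() setter -->
<bean id="empBySetter" class="com.java.Employee">
    <property name="name" value="Ravi"/>
</bean>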


----------------------------------------------------------------------------------------
Q:- What is autowiring in Spring? What are the autowiring modes?
Ans:- Autowiring enables Spring to inject bean dependencies automatically,
so we don't need to write explicit injection logic.
Example:
<bean id="emp" class="com.java.Employee" autowire="byName"/>

There are 4 types of autowiring:
1- no:- This is the default mode; autowiring is not enabled.

2- byName:- Injects the bean based on the property name; it uses the setter method.

3- byType:- Injects the bean based on the property type; it uses the setter method.

4- constructor:- It injects the bean using the constructor.

-----------------------------------------------------------------------------------------
Q:- How to handle exceptions in the Spring MVC framework?
Ans:- Spring MVC provides several ways to handle exceptions (a sketch follows this list):
1- Controller based: we can define exception-handler methods in our controller class using @ExceptionHandler.
2- Global exception handler: exception handling is a cross-cutting concern, and Spring provides @ControllerAdvice to handle it globally.
3- HandlerExceptionResolver: implement this interface to resolve exceptions in a generic way.
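A minimal sketch of the global (controller-advice) style; GlobalExceptionHandler, the "error" view name and the exception type are made-up choices:

import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.servlet.ModelAndView;

@ControllerAdvice
public class GlobalExceptionHandler {

    // invoked whenever any controller throws a NullPointerException
    @ExceptionHandler(NullPointerException.class)
    public ModelAndView handleNullPointer(NullPointerException ex) {
        ModelAndView mav = new ModelAndView("error");   // logical view name, resolved by the ViewResolver
        mav.addObject("message", ex.getMessage());
        return mav;
    }
}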
----------------------------------------------------------------------------------------
Q:- What are some of the important Spring annotations used?
Ans:-
@Controller
@PathVariable
@Qualifier
@Configuration
@Scope
@RequestMapping
@Autowired
@Service
@Aspect
------------------------------------------------------------------------------------------
Q:- How to integrate the Spring and Hibernate frameworks?
Ans:- We can use the Spring ORM module to integrate the Spring and Hibernate frameworks.
Spring ORM also provides support for Spring's declarative transaction management (a configuration sketch follows).
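A rough bean-configuration sketch, assuming Hibernate 5, an already-defined dataSource bean and a made-up com.java.model entity package (the tx namespace must be declared for the last line):

<bean id="sessionFactory" class="org.springframework.orm.hibernate5.LocalSessionFactoryBean">
    <property name="dataSource" ref="dataSource"/>
    <property name="packagesToScan" value="com.java.model"/>
</bean>

<bean id="transactionManager" class="org.springframework.orm.hibernate5.HibernateTransactionManager">
    <property name="sessionFactory" ref="sessionFactory"/>
</bean>

<!-- enables @Transactional-driven declarative transaction management -->
<tx:annotation-driven transaction-manager="transactionManager"/>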
-----------------------------------------------------------------------------------------

Q:- What is Hibernate?
Ans:- Hibernate is a Java-based ORM tool that provides a framework for mapping application domain objects to relational database tables and vice versa.

Hibernate provides a reference implementation of the Java Persistence API (JPA),
which makes it a great choice as an ORM tool, with the benefit of loose coupling.
Loose coupling means the dependency level is reduced as much as possible.
Hibernate configuration is flexible and can be done through an XML configuration file as well as programmatically.

-------------------------------------------------------------------------------------------------

Q:- What is the difference between the get() and load() methods in Hibernate?
ans: Both are used to fetch data from the database (a code sketch follows this list).

get():
1- Returns null if the object is not found.
2- Always hits the database.
3- Returns the real object, not a proxy.
4- Should be used if you are not sure about the existence of the instance.

load():
1- Throws ObjectNotFoundException if the object is not found.
2- Does not hit the database immediately; it first returns a proxy, and the database is hit only when the object's state is accessed.
3- Returns a proxy object.
4- Should be used if you are sure the instance exists.
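A hedged sketch, assuming Hibernate 5, an open Session and a mapped Employee entity (Employee and its getName() method are assumptions, not from the text):

import org.hibernate.Session;

public class GetVsLoadDemo {

    public void demo(Session session) {
        // get(): hits the database right away; returns null when the row is missing
        Employee e1 = session.get(Employee.class, 1L);

        // load(): returns a proxy immediately; the database is hit (and
        // ObjectNotFoundException may be thrown) only when the proxy's state is accessed
        Employee e2 = session.load(Employee.class, 2L);
        System.out.println(e2.getName());
    }
}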
==================================================================================

Q:- What is synchronization in threads?
Ans:- Synchronization is the process of enabling the lock of an object.
Every Java object is associated with a lock mechanism; by default this lock is not used.
If you want multiple threads not to enter a critical section at the same time in a multithreaded environment,
you use synchronization:

you enable the lock of the object so that multiple threads are not allowed to access the critical section concurrently (see the sketch below).
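A minimal sketch, assuming a shared Counter object (a made-up name):

public class Counter {
    private int count = 0;

    // only one thread at a time can hold this object's lock and run this method
    public synchronized void increment() {
        count++;
    }

    public synchronized int get() {
        return count;
    }
}

// usage: several threads calling counter.increment() will not lose updates,
// because each call must first acquire the same object's lock.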
----------------------------------------------------------------------------------------
Q:- What is the difference between Runnable and Thread?
ans:- Runnable is an interface, available in the java.lang package;
it has only one abstract method: public void run().
Thread is a concrete class, also available in the java.lang package; it implements the Runnable interface
and provides a do-nothing default implementation of the run() method, which your subclass overrides.
-----------------------------------------------------------------------------------
Q: Which one is better, Runnable or Thread?
Ans:- It depends upon the requirement;
the execution is the same in both cases.
The only difference is that when you write your user-defined thread class by extending Thread,
you cannot extend any other class, because Java does not support multiple inheritance with classes.
But when you write a user-defined thread by implementing the Runnable interface,
you still have the chance to extend another class, and you can also implement other interfaces (see the sketch below).
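A compact sketch of both approaches (class names are made up):

// approach 1: extend Thread and override run()
class MyThread extends Thread {
    @Override
    public void run() {
        System.out.println("running via a Thread subclass");
    }
}

// approach 2: implement Runnable and hand it to a Thread
// (this leaves you free to extend another class)
class MyTask implements Runnable {
    @Override
    public void run() {
        System.out.println("running via Runnable");
    }
}

class Launcher {
    public static void main(String[] args) {
        new MyThread().start();
        new Thread(new MyTask()).start();
    }
}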

-----------------------------------------------------------------------------------------
Q:- What is autoboxing?
ans:- Autoboxing is a feature of Java 5.
You cannot store primitive values in a collection,
so whenever you need a primitive as an object type you have to use a wrapper class;
before Java 5 you had to create the Integer object yourself by passing the primitive to it.

Java 5 supports the autoboxing and unboxing features.

AutoBoxing:- autoboxing/unboxing is the process of converting a primitive to a wrapper and a wrapper back to a primitive automatically (see the sketch below).
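A minimal sketch (AutoboxingDemo is a made-up name):

import java.util.ArrayList;
import java.util.List;

public class AutoboxingDemo {
    public static void main(String[] args) {
        Integer boxed = 10;            // autoboxing: int -> Integer, no explicit new Integer(10)
        int unboxed = boxed;           // unboxing: Integer -> int

        List<Integer> list = new ArrayList<>();
        list.add(5);                   // the int literal is autoboxed before being stored
        int first = list.get(0);       // and unboxed again when read back
        System.out.println(unboxed + " " + first);
    }
}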

--------------------------------------------------------------------------------------------
Q:- What is the difference between Java 7 and Java 8?
Java 7 added some new features:
1- multi-catch: one catch block can handle multiple exceptions
2- improved rethrowing of exceptions with more precise type checking
3- the diamond operator
4- try-with-resources

Java 8 added new features:
1- lambda expressions and functional interfaces
2- Stream API
3- forEach() in the Iterable interface
4- Collection API improvements
5- Date and Time API

Q:- Explain the Java 1.8 features?
Ans:-
1) One of the best features of Java 1.8 is functional programming,
which you can achieve using lambda expressions; that helps you reduce a lot of code.
Boilerplate code is reduced by using lambda expressions.

Boilerplate code means a piece of code which can be used over and over again; in other words,
a piece of reusable code.

2) They have given the Streams API,
which helps you write applications that can process data in parallel.

3) They have also enhanced utility classes such as the collection classes and array classes
with new predefined methods (a lambda/stream sketch follows).
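A small sketch of a lambda plus the Stream API (Java8Demo and the sample names are made up):

import java.util.Arrays;
import java.util.List;

public class Java8Demo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Ravi", "Asha", "Ram");

        // lambda expression replaces a whole anonymous Runnable class (less boilerplate)
        Runnable task = () -> System.out.println("hello from a lambda");
        task.run();

        // Stream API: filter and print the names starting with "R";
        // parallelStream() could be used instead for parallel processing
        names.stream()
             .filter(n -> n.startsWith("R"))
             .forEach(System.out::println);
    }
}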

-------------------------------------------------------------------------------------------
Q:- What is the difference between SOAP-based web services and REST-based web services?
Ans:-
1- SOAP-based web services are based on standards (a defined contract and interface hierarchy),
but REST-based web services have no such strict standard.

2- SOAP-based web services have to use the SOAP protocol over HTTP,
whereas RESTful web services are based only on the HTTP protocol.

3- SOAP-based web services force you to use only XML for communication, but in REST-based web services
you have flexibility: you can use JSON, XML, plain text, HTML or PDF.

4- SOAP web services are very heavyweight; you usually need an IDE or toolkit to develop them,
but a RESTful web service can even be developed in Notepad; it is very lightweight.

5- Testing a SOAP-based web service is a long process, but a REST-based web service
can be tested with a simple URL.

--------------------------------------------------------------------------------------------
Q:- What is the ServletContext object?
ans: The ServletContext is a web-container level object. There are three scopes available in servlets:
servlet context, request and session.
If you want some data to be shared among all servlets, and all users should be able to use that data,
you can define ServletContext parameters in the web.xml file;
the ServletContext is a universal (application-wide) object that can be accessed from anywhere in the application.

---------------------------------------------------------------------------------------------
Q:- How many instances of ServletContext will be there?
Ans:- Only one per web application.

Q:- What is ServletConfig?
Ans:- The ServletConfig object is related to one particular servlet,
so if you have a requirement to pass some data to a particular servlet only, you define it as an init parameter for that servlet.
There is one ServletConfig object per servlet, regardless of how many users call it (a sketch follows).
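A hedged sketch of reading both kinds of parameters inside a servlet (the parameter names appName and adminEmail are made up; they would be declared in web.xml as <context-param> and <init-param> respectively):

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ConfigDemoServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // ServletContext: shared by every servlet in the application
        String appName = getServletContext().getInitParameter("appName");

        // ServletConfig: visible to this servlet only
        String adminEmail = getServletConfig().getInitParameter("adminEmail");

        resp.getWriter().println(appName + " / " + adminEmail);
    }
}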

-------------------------------------------------------------------------------------------

Q:- What is the difference between HashSet and LinkedHashSet?
Ans:
In a HashSet, data is stored based on the hashCode value of the object, so you can call it random order:
whatever order you store the elements in, that is not the order they are kept in.
HashSet internally uses a hash-table (index) representation, so searching is fast.

In a LinkedHashSet, data is kept in the same order it was added by the user; LinkedHashSet internally also maintains
a linked list of nodes, so when you have a requirement to preserve the insertion order
we go for LinkedHashSet (see the sketch below).
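A minimal sketch showing the ordering difference (SetOrderDemo is a made-up name):

import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.Set;

public class SetOrderDemo {
    public static void main(String[] args) {
        Set<String> hashSet = new HashSet<>();
        Set<String> linkedHashSet = new LinkedHashSet<>();

        for (String s : new String[]{"banana", "apple", "cherry"}) {
            hashSet.add(s);
            linkedHashSet.add(s);
        }

        System.out.println(hashSet);        // order depends on hash codes, e.g. [banana, cherry, apple]
        System.out.println(linkedHashSet);  // insertion order preserved: [banana, apple, cherry]
    }
}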
------------------------------------------------------------------------------------
Q:- Is there any collection where we can store key-value pairs with the keys kept in sorted order?

Ans:- TreeMap.
Suppose I add an Employee object as a key in a TreeMap; what will happen?
Ans:- It will throw a ClassCastException,
because whenever you insert a key into a TreeMap, that key must be a Comparable object.
So if you want to use Employee objects as keys, the Employee class has to implement the Comparable interface
and override the compareTo() method; in it you compare on some field such as the employee id or employee name,
and that gives you the natural sorting order (see the sketch below).
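A minimal sketch (this Employee class and its fields are made up for illustration):

import java.util.Map;
import java.util.TreeMap;

class Employee implements Comparable<Employee> {
    private final int id;
    private final String name;

    Employee(int id, String name) { this.id = id; this.name = name; }

    @Override
    public int compareTo(Employee other) {
        return Integer.compare(this.id, other.id);   // natural order: by employee id
    }

    @Override
    public String toString() { return id + ":" + name; }
}

class TreeMapDemo {
    public static void main(String[] args) {
        Map<Employee, String> map = new TreeMap<>();
        map.put(new Employee(2, "Asha"), "HR");
        map.put(new Employee(1, "Ravi"), "IT");
        System.out.println(map);   // keys come back sorted by id: {1:Ravi=IT, 2:Asha=HR}
    }
}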

----------------------------------------------------------------------------------

Q:- What is the difference between wait() and sleep()?
Ans:-
wait() is an instance method of the java.lang.Object class.
It has three overloaded versions:
wait() without any parameter
wait(long timeout) with a long millisecond timeout
wait(long timeout, int nanos) with long and int parameters

sleep() is a static method of the Thread class.
It has two overloaded versions:
sleep(long millis) with a long parameter
sleep(long millis, int nanos) with long and int parameters

sleep() is used to pause the execution of the current thread for a specified duration of time.

The main difference shows up in a synchronized context.
When you call sleep() on a running thread, the thread goes to the sleeping state
but it does not release the lock of the object. In the case of wait(), the thread goes to the waiting state
and immediately releases the lock of the object.
Because wait() is responsible for releasing the lock of an object,
it must be called from a synchronized context only,
but sleep() can be called from non-synchronized code as well (see the sketch below).
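A minimal sketch of both calls inside a synchronized block (WaitVsSleepDemo and the timeouts are made up):

public class WaitVsSleepDemo {
    private static final Object LOCK = new Object();

    public static void main(String[] args) throws InterruptedException {
        synchronized (LOCK) {
            // sleep(): pauses the current thread but keeps holding LOCK
            Thread.sleep(100);

            // wait(): releases LOCK and waits until another thread calls notify()/notifyAll(),
            // or until the 100 ms timeout elapses
            LOCK.wait(100);
        }
    }
}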


Q:- What is the difference between HashMap and ConcurrentHashMap? Why did ConcurrentHashMap come into the picture?
Ans:- First compare HashMap and Hashtable:
Hashtable is synchronized,
which means at a time only one thread can read from or write to the Hashtable.
HashMap is not synchronized, so multiple threads can access it concurrently, but it is not thread safe.
The problem with Hashtable is that whenever a thread is writing to it or reading from it,
it locks the entire object, so no other thread is allowed in;
even if the table has, say, one lakh entries, only one thread can access it at a time.
ConcurrentHashMap solves this by locking only a portion of the map (segment/bucket-level locking),
so multiple threads can safely work on different parts of the map at the same time (see the sketch below).
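A small sketch of two threads safely updating one ConcurrentHashMap (ConcurrentMapDemo and the key name are made up):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentMapDemo {
    public static void main(String[] args) throws InterruptedException {
        Map<String, Integer> counts = new ConcurrentHashMap<>();

        Runnable task = () -> {
            for (int i = 0; i < 1000; i++) {
                counts.merge("hits", 1, Integer::sum);   // atomic update, no external locking needed
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start(); t2.start();
        t1.join();  t2.join();

        System.out.println(counts.get("hits"));   // 2000
    }
}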

Apache Spark Interview Questions

    Top 50 Spark interview questions
---------------------------------------------------------

Q:- How do you connect to Hadoop in your project?
ans:- Through an edge node we can connect to the Hadoop cluster.

Q:- What is Spark?
ans:- Spark is an open-source cluster computing framework.
It does real-time data processing as well as batch processing,
whereas Hadoop (MapReduce) can do batch processing only.

-------------------------------------------------------------------------------------
Q:- Why Spark? Why not Hadoop?
Ans:- There are several reasons:
1- The main reason: Hadoop (MapReduce) can do batch processing only, but Spark can do both real-time data processing and
batch processing.

2- The number of lines of code you write in Spark is less compared to Hadoop.

3- Hadoop is written in Java but Spark is written in Scala,
which is itself a JVM language.


--------------------------------------------------------------------------------------

Q:- What are the features of Spark?
ans:-
1- Real-time stream processing: Spark offers real-time processing of data, while
Hadoop MapReduce could only handle and process data that is already present;
it does not support real-time data processing, which is why Spark came into the picture.

2- Dynamic in nature: it is easy to develop a parallel application.

3- In-memory computation in Spark: no need to fetch data from disk every time.
4- Reusability.
5- Fault tolerance: Apache Spark provides fault tolerance through its core abstraction, the RDD;
Spark RDDs are able to recover from the failure of any worker node in the cluster.

6- Lazy evaluation: lazy evaluation means that execution will not start until an action is triggered.
Transformations are lazy in nature: when we call some operation on an RDD, it does not execute immediately.

------------------------------------------------------------------------------------------------------
Q:- What are the advantages of Spark?
ans:- 1- Increased manageability
2- Saves computation and increases speed
3- Reduced complexity
4- Optimization
----------------------------------------------------------------------------------------------------
Q:- What are the components of the Spark ecosystem?
ans:- Spark Core
Spark Streaming
Spark SQL
GraphX
MLlib

----------------------------------------------------------------------------------------------------
Q:- What is the lifecycle of a Spark program?
ans:-
Lifecycle:-
1- Load data on the cluster
2- Create an RDD (once you create an RDD you apply transformations on it)
3- Do transformations
4- Perform an action
5- Create a DataFrame
6- Perform queries on the DataFrame


In other words:

1- The initial part is loading the data;
you can also load streaming data.

2- Once the data is loaded we need to transform the data
using transformations such as map, flatMap, filter.

3- Once transformation is complete we have to perform an action on the transformed data.
----------------------------------------------------------------------------------------------

Lifecycle of Spark program
-------------------------------------------------------------------------------------------------
Q:- The following steps explain the lifecycle of a Spark application with the standalone resource manager:
Ans:-
1-The user submits a spark application using the spark-submit command.

2-Spark-submit launches the driver program on the same node in (client mode) or on the cluster (cluster mode)
and invokes the main method specified by the user.

3-The driver program contacts the cluster manager to ask for resources to launch executor JVMs
based on the configuration parameters supplied.

4-The cluster manager launches executor JVMs on worker nodes.

5-The driver process scans through the user application.
Based on the RDD actions and transformations in the program, Spark creates an operator graph.

6-When an action (such as collect) is called, the graph is submitted to a DAG scheduler.
The DAG scheduler divides the operator graph into stages.

7-A stage comprises tasks based on partitions of the input data. The DAG scheduler pipelines operators...


In other words:
1- Spark has something called the driver;
the driver is like the master, the one which gives commands to everyone else.

2- Inside the driver we have something called the SparkContext; it is just like the Spring context.

3- This SparkContext controls the worker nodes;
inside the worker nodes we have executors, and these executors actually execute the tasks.

4- So the driver acts as a master: the driver instructs the workers to execute tasks.
That is done through the SparkContext, which instructs the workers to execute tasks on each node.

5- If a particular task fails, the SparkContext rebuilds it and sends it to a worker again.

--------------------------------------------------------------------------------------------------

Q:- Are there any benefits of Apache Spark over Hadoop MapReduce?
Ans:- Spark has the ability to perform data processing up to 100 times faster than MapReduce.
Also, Spark has in-memory processing and built-in libraries to perform multiple workloads together,
like batch processing, streaming, interactive processing, etc.


Q:- Define Apache Spark Core; how is it useful for a Scala developer?
Spark Core is used for memory management, job monitoring, fault tolerance,
job scheduling and interaction with storage systems. The RDD is the core abstraction of Spark Core and is suited for fault tolerance;
an RDD is a collection of distributed objects available across multiple nodes
that are generally manipulated in parallel.


Q:- How many cluster modes are supported in Apache Spark?
Ans:-
Three cluster managers are supported:
standalone
Mesos
YARN

Q:- What are the ways to launch Spark over YARN?
Ans:- When running on YARN, Spark executors run as YARN containers.
Spark supports two modes for running on YARN:

1- Cluster mode: useful in a production environment;
launched with spark-submit --master yarn-cluster (or --master yarn --deploy-mode cluster in newer versions)

2- Client mode: useful for development purposes;
launched with spark-submit or spark-shell --master yarn-client (or --master yarn --deploy-mode client).


Q:- While submitting the Spark job, what properties are you going to submit?
ans:


Q:-What is SparkContext in Apache Spark?
ans:SparkContext is the entry point of Spark functionality.
The most important step of any Spark driver application is to generate SparkContext.
 It allows your Spark Application to access Spark Cluster with the help of Resource Manager.
The resource manager can be one of these three- Spark Standalone,  YARN, Apache Mesos.

----------------------------------------------------------------------------------------------------

Q:- Who creates the SparkContext?
ans:- The Spark driver application creates the SparkContext; generating the SparkContext is the most important step of any Spark driver application.
The SparkContext is the entry gate to Apache Spark functionality:
 it allows your Spark application to access the Spark cluster with the help of a resource manager.

-----------------------------------------------------------------------------------------------------

Q: How do you create the SparkContext?
ans:- If you want to create a SparkContext, first a SparkConf should be created.
The SparkConf holds the configuration parameters that our Spark driver application will pass to the SparkContext.

In short, it guides how to access the Spark cluster. After the creation of a SparkContext object,
 we can invoke functions such as textFile, sequenceFile, parallelize etc.
The different contexts in which it can run are local, yarn-client, a Mesos URL and a Spark (standalone) URL.
Once the SparkContext is created, it can be used to create RDDs, broadcast variables
and accumulators, access Spark services and run jobs.
All these things can be carried out until the SparkContext is stopped (a sketch follows).
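A minimal sketch using the Java API (the Scala equivalent would be new SparkContext(conf)); the app name and the local[*] master are illustrative assumptions:

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ContextDemo {
    public static void main(String[] args) {
        // SparkConf carries the configuration that the driver passes to the context
        SparkConf conf = new SparkConf()
                .setAppName("context-demo")
                .setMaster("local[*]");   // local mode; could be a YARN/Mesos/standalone master instead

        JavaSparkContext sc = new JavaSparkContext(conf);
        System.out.println(sc.parallelize(Arrays.asList(1, 2, 3)).count());   // 3
        sc.stop();
    }
}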

------------------------------------------------------------------------------------------------------
Q:- What are stages in Spark?
Ans:- A stage is nothing but a step in the physical execution plan; in other words,
it is a physical unit of the execution plan. A stage is a set of parallel tasks,
i.e. one task per partition. Basically,
each job gets divided into smaller sets of tasks, and each such set is a stage.

Types of Spark stages:
basically, stages in Apache Spark fall into two categories:

a. ShuffleMapStage in Spark

b. ResultStage in Spark

-----------------------------------------------------------------------------------------------------
Q: What is a Spark executor?
ans:- Executors in Spark are processes launched on the worker nodes.
They are in charge of running the individual tasks in a given Spark job.
Executors also provide in-memory storage for Spark RDDs.

--------------------------------------------------------------------------------------------------------
Q:-  Can we run Apache Spark without Hadoop?
ans:- Yes, Apache Spark can run without Hadoop:

standalone, or in the cloud. Spark doesn't need a Hadoop cluster to work.
Spark can read and then process data from other file systems as well; HDFS is just one of the file systems that Spark supports.

Spark is meant for distributed computing. In that case the data is distributed across the computers, and
Hadoop's distributed file system HDFS is one way to store data that does not fit in memory.


-------------------------------------------------------------------------------------------------------
Q:-Different Running Modes of Apache Spark?
Ans:- Apache Spark can be run in the following three modes:

(1) Local mode
(2) Standalone mode
(3) Cluster mode

----------------------------------------------------------------------------------------------------------
Q:- What are the roles and responsibilities of worker nodes in the Apache Spark cluster?
Is the worker node in Spark the same as the slave node?

Ans:- A worker node refers to a node which runs the application code in the cluster.
The worker node is the slave node: the master node assigns work and the worker nodes actually perform the assigned tasks.
Worker nodes process the data stored on the node
and report the resources to the master; based on resource availability the master schedules tasks.

-----------------------------------------------------------------------------------------------------------

Q:- What are the features of DataFrame in Spark?
List out the characteristics of DataFrame in Apache Spark.

Ans:-DataFrames are the distributed collection of data. In DataFrame, data is organized into named columns.
It is conceptually similar to a table in a relational database.

Out of the box, DataFrame supports reading data from the most popular formats, including JSON files,
Parquet files and Hive tables. It can also read from distributed file systems (HDFS), local file systems,
cloud storage (S3), and external relational database systems through JDBC.
----------------------------------------------------------------------------------------------------

Q:-What are the different methods to run Spark over Apache Hadoop?
Ans:-
1. Local standalone mode — everything (Spark, driver, workers, etc.) runs on the same machine locally.
Generally used for testing and developing the logic of a Spark application.

2. YARN client — the client which runs driver program and submits the job is same ,
 and the workers (data nodes) are separate.

3. Yarn Cluster — the driver program runs on one of the dedicated data nodes and
the workers are separate. Most advisable for production platform.

4. Mesos:

Mesos is used in large-scale production deployments.


-------------------------------------------------------------------------------------------------
Q:- What is the Parquet file format? Where should the Parquet format be used? How do you convert data to the Parquet format?
Ans:-
Parquet is a columnar data representation that is one of the best choices
for storing large data for the long run for analytics purposes.
Spark can perform both read and write operations with Parquet files.
Parquet is a columnar data storage format.

Parquet was created to make the benefits of compressed,
efficient columnar data representation available to any project,
 regardless of the choice of data processing framework, data model, or programming language.

Parquet is a format that can be processed by a variety of systems: Spark SQL,
 Impala, Hive, Pig, etc. It doesn't lock you into a particular programming language,
since the format is defined using Thrift, which supports a number of programming languages.
For example, Impala is written in
C++ whereas Hive is written in Java, yet they can easily interoperate on the same Parquet data.


----------------------------------------------------------------------------------------------------

Q:- What is RDD?
ans:-
RDD stands for Resilient Distributed Dataset.
An RDD is an immutable, fault-tolerant, distributed collection of objects that is processed in parallel.
      or

RDDs are immutable(can’t be modified once created) and fault tolerant, Distributed because
it is distributed across cluster and Dataset because it holds data.

         or

RDD is a collection of distributed objects available across multiple nodes
that are generally manipulated in parallel.



So why RDD? Apache Spark lets you treat your input files almost like any other variable,
 which you cannot do in Hadoop MapReduce.
RDDs are automatically distributed across the network by means of Partitions.



Partitions

RDDs are divided into smaller chunks called Partitions, and when you execute some action,
a task is launched per partition. So it means, the more the number of partitions, the more the parallelism.
Spark automatically decides the number of partitions that an RDD has to be divided into
 but you can also specify the number of partitions when creating an RDD.
These partitions of an RDD are distributed across all the nodes in the network.

Creating an RDD
Creating an RDD is easy, it can be created either from an external file or
 by parallelizing collections in your driver. For example,

val rdd = sc.textFile("/some_file",3)
val lines = sc.parallelize(List("this is","an example"))
The first line creates an RDD from an external file,
and the second line creates an RDD from a list of Strings.

 Note that the argument ‘3’ in the method call sc.textFile() specifies the number of partitions
that has to be created. If you don’t want to specify the number of partitions,
 then you can simply call sc.textFile(“some_file”).

Actions/Transformations
There are two types of operations that you can perform on an RDD- Transformations and Actions.
Transformation applies some function on a RDD and creates a new RDD,
 it does not modify the RDD that you apply the function on (remember that RDDs are resilient/immutable).
 Also, the new RDD keeps a pointer to its parent RDD.


When you call a transformation, Spark does not execute it immediately, instead it creates a lineage.
 A lineage keeps track of what all transformations has to be applied on that RDD,
including from where it has to read the data. For example, consider the below example


val rdd = sc.textFile("spam.txt")
val filtered = rdd.filter(line => line.contains("money"))
filtered.count()
sc.textFile() and rdd.filter() do not get executed immediately,
it will only get executed once you call an Action on the RDD - here filtered.count().
An Action is used to either save result to some location or to display it.
You can also print the RDD lineage information
by calling filtered.toDebugString (filtered is the RDD here).

RDDs can also be thought of as a set of instructions that has to be executed,
first instruction being the load instruction.
Caching
You can cache an RDD in memory by calling rdd.cache(). When you cache an RDD,
its partitions are loaded into the memory of the nodes that hold it.


Caching can improve the performance of your application to a great extent.
In the previous section you saw that when an action is performed on an RDD,
it executes its entire lineage.
Now imagine you are going to perform an action multiple times on the same RDD
which has a long lineage; this will cause an increase in execution time.
Caching stores the computed result of the RDD in memory, thereby
eliminating the need to recompute it every time. You can think of caching as
breaking the lineage, but
Spark does remember the lineage so that the RDD can be recomputed in case of a node failure.

         
                         or

Resilient Distributed Datasets (RDDs)
RDDs are the main logical data unit in Spark. They are a distributed collection of objects,
 which are stored in memory or on disks of different machines of a cluster.
A single RDD can be divided into multiple logical partitions so that
these partitions can be stored and processed on different machines of a cluster.

RDDs are immutable (read-only) in nature. You cannot change an original RDD,
but you can create new RDDs by performing coarse-grain operations, like transformations, on an existing RDD.

-------------------------------------------------------------------------------------------------------
Q:-Why do we need RDD in Spark?
The key motivations behind the concept of RDD are-

Iterative algorithms.
Interactive data mining tools.
DSM (Distributed Shared Memory)

The main challenge in designing RDDs was defining a program interface that provides fault tolerance efficiently.
To achieve fault tolerance efficiently, RDDs provide a restricted form of shared memory, based on coarse-grained transformations rather than fine-grained updates to shared state.


------------------------------------------------------------------------------------------------------
Q:- Features of RDD

Ans:-Partitioning
In-memory Computation
Immutability
Lazy Evaluation

Distributed
Resilient

-----------------------------------------------------------------------------------------------------

There are two basic operations which can be done on RDDs. They are:
Ans:-
Transformations
Actions

Transformations: These are functions which accept existing RDDs as the input and outputs one or more RDDs.
 The data in the existing RDDs does not change as it is immutable.
 Some of the transformation operations are shown in the table given below:

Functions Description
map()         Returns a new RDD by applying the function on each data element
filter() Returns a new RDD formed by selecting those elements of the source on which the function returns true
reduceByKey() Used to aggregate values of a key using a function
groupByKey() Used to convert a (key, value) pair to (key, <iterable value>) pair
union()         Returns a new RDD that contains all the elements of the source RDD and of the argument RDD
intersection() Returns a new RDD that contains an intersection of elements in the datasets
These transformations are executed when they are invoked or called.
Every time transformations are applied, a new RDD is created.

Actions: Actions in Spark are functions which return the end result of RDD computations.
 It uses a lineage graph to load the data onto the RDD in a particular order.
 After all transformations are done, actions return the final result to the Spark Driver.
Actions are operations which provide non-RDD values. Some of the common actions used in Spark are:

Functions Description
count()         Gets the number of data elements in an RDD
collect() Gets all data elements in the RDD as an array
reduce()  Aggregates the data elements of the RDD using a function that takes two arguments and returns one
take(n)         Used to fetch the first n elements of the RDD
foreach(operation) Used to execute operation for each data element in the RDD
first() Retrieves the first data element of the RDD

---------------------------------------------------------------------------------------------------
Q:- How many ways are there to create an RDD?
Ans:-Creating an RDD
An RDD can be created in three ways:

1-By loading an external dataset
You can load an external file into an RDD.
The types of files you can load are csv, txt, JSON, etc.
(An example of loading a text file into an RDD was shown in the RDD section above.)

2-By parallelizing the collection of objects
When Spark’s parallelize method is applied on a group of elements, a new distributed dataset is created.
This is called an RDD.

3-By performing transformations on existing RDDs
One or more RDDs can be created by performing transformations on existing RDDs,
for example by applying a map() function to an existing RDD.


----------------------------------------------------------------------------------------------------
Q:- What is the map transformation operation in Apache Spark?
What is the need for the map transformation?
What processing can be done in map in Spark? Explain with an example.

Ans:- Map is a transformation applied to each element in an RDD, and it provides a new RDD as a result.
In a map transformation, user-defined business logic is applied to all the elements in the RDD.
It is similar to flatMap, but unlike flatMap, which can produce 0, 1 or many outputs per input element,
map produces exactly one output per input element.
A map operation transforms an RDD of length N into another RDD of length N:

A ---> a
B ---> b
C ---> c
(map operation)

A map transformation does not shuffle data from one partition to another; it keeps the operation narrow (see the sketch below).
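A small sketch contrasting map and flatMap with the Java API (assuming Spark 2.x, where flatMap returns an Iterator, and a JavaSparkContext sc like the one created earlier):

import java.util.Arrays;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class MapVsFlatMapDemo {
    public static void demo(JavaSparkContext sc) {
        JavaRDD<String> lines = sc.parallelize(Arrays.asList("hello world", "spark"));

        // map: exactly one output per input -> 2 elements (the line lengths)
        JavaRDD<Integer> lengths = lines.map(String::length);

        // flatMap: 0..n outputs per input -> 3 elements (the individual words)
        JavaRDD<String> words = lines.flatMap(line -> Arrays.asList(line.split(" ")).iterator());

        System.out.println(lengths.collect());   // [11, 5]
        System.out.println(words.collect());     // [hello, world, spark]
    }
}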

Q:- Explain the flatMap() transformation in Apache Spark.
ans:-









---------------------------------------------------------------------------------------------

Q:- What is the difference between DataFrame and Dataset?
ans:-











-----------------------------------------------------------------------------------------------
Q:- What is the difference between map and flatMap?
ans:-











------------------------------------------------------------------------------------------------------
Q:- What is the difference between groupByKey and reduceByKey?
Ans:-











--------------------------------------------------------------------------------------------------------
Q:- How is an RDD fault tolerant?
Ans:-










-----------------------------------------------------------------------------------------------------------
Q:- Where do we need mapPartitions?
Ans:-










--------------------------------------------------------------------------------------------------------
Q:- How to reduce the number of partitions?
Ans:-








-------------------------------------------------------------------------------------------------------
Q:- How will we read different file formats?
Ans:-







--------------------------------------------------------------------------------------------------------
Q:- What are the types of join in Spark?
Ans:-









-------------------------------------------------------------------------------------------------------
Q:- If a Hive job is taking too much time, what steps are you going to take?
Ans:-








-----------------------------------------------------------------------------------------------------
Q:- How to optimize the query?
Ans:-







------------------------------------------------------------------------------------------------------
Q:- How will we do unit testing?

Ans:-





-----------------------------------------------------------------------------------------------------
Q:- If any partition is corrupt, how do we handle it?
Ans:-








-----------------------------------------------------------------------------------------------------
Q:- What is the difference between a DAG and lineage?
Ans:-










-----------------------------------------------------------------------------------------------------
Q:- How to increase the number of partitions?
Ans:-









---------------------------------------------------------------------------------------------------
Q:- If I am applying aggregation after grouping, which RDD operation do we have to use?
Ans:-







----------------------------------------------------------------------------------------------------
Q:- What is the difference between ORDER BY and SORT BY?
Ans:-












------------------------------------------------------------------------------------------------------
======================================================================================================

    Hive Interview Questions
=======================================================================================================

Q:-What is Hive?








-----------------------------------------------------------------------------------------------------
Q:- Where do we use Hive?
Ans:-







-------------------------------------------------------------------------------------------------------
Q:- What are the features of Hive?
Ans:-







--------------------------------------------------------------------------------------------------------
Q:- What are tabular (table-generating) functions in Hive?
Ans:-









--------------------------------------------------------------------------------------------------------
Q:-How to delete some data in hive?
Ans:-








--------------------------------------------------------------------------------------------------------
Q:- What are the table variables in Hive?
Ans:-









---------------------------------------------------------------------------------------------------------
Q:- Suppose we drop an internal (managed) table; what will happen?
Ans:-








--------------------------------------------------------------------------------------------------------
Q:- Suppose we drop an external table; what will happen?








---------------------------------------------------------------------------------------------------------
Q:- What are partitions? Where did you implement them in your project?
Ans:-









--------------------------------------------------------------------------------------------------------
Q:- How many types of partitioning are there in Hive?
Ans:-









--------------------------------------------------------------------------------------------------------
Q:- Which type of partitioning should we go for?
Ans:-








-------------------------------------------------------------------------------------------------------
Q:- Why do we use bucketing?
Ans:-









-------------------------------------------------------------------------------------------------------
Q:- What are the file formats in Hive?
Ans:-











======================================================================================================