How can we avoid duplicate rows without using the DISTINCT command?

Questions by subbaraovangalli   answers by subbaraovangalli


Shiril Kumar Dubey

  • Mar 27th, 2007
 

This code is not working correctly:


select a.deptno
from   scott.emp a, scott.emp b
where  a.deptno = b.deptno
and    a.rowid <> b.rowid
/

This code does not avoid duplicate records. Because of the AND operator, Oracle returns a row for every pair of rows that satisfies both conditions, 1) a.deptno = b.deptno and 2) a.rowid <> b.rowid.

Suppose deptno 10 appears three times in the EMP table. Each of those three rows has the same deptno as the other two, so condition 1) is satisfied, and each has a different ROWID from the other two, so condition 2) is satisfied as well. Nothing is filtered out; instead deptno 10 is returned once for every such pairing, so the duplicate records are still retrieved.
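A minimal sketch to make this concrete (emp_demo and its rows are hypothetical, not the real SCOTT.EMP data): with three rows sharing deptno 10, the self-join returns deptno 10 six times instead of once.

-- Hypothetical demo table with three rows for the same deptno
CREATE TABLE emp_demo (empno NUMBER, deptno NUMBER);

INSERT INTO emp_demo VALUES (1, 10);
INSERT INTO emp_demo VALUES (2, 10);
INSERT INTO emp_demo VALUES (3, 10);

-- Each row pairs with the other two, so this returns 10 six times
SELECT a.deptno
FROM   emp_demo a, emp_demo b
WHERE  a.deptno = b.deptno
AND    a.rowid <> b.rowid;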


MDP

  • Apr 11th, 2007
 

This query returns the first row for each unique ID in the table. It could also be used as part of a DELETE statement to remove duplicates if needed (see the sketch after the query).

SELECT ID
FROM   func t1
WHERE  ROWID = (SELECT MIN(ROWID)
                FROM   func
                WHERE  ID = t1.ID);
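A sketch of the DELETE variant mentioned above (same hypothetical func table and ID column as the query): keep the row with the smallest ROWID for each ID and remove the rest.

-- Deletes every row except the one with the smallest ROWID per ID
DELETE FROM func t1
WHERE  ROWID <> (SELECT MIN(ROWID)
                 FROM   func
                 WHERE  ID = t1.ID);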

Also: you can use a GROUP BY without a summary function:

SELECT ID
FROM   func t1
GROUP BY ID;

Mudit Sharma

  • Aug 8th, 2007
 

Hi,
I think the query above is not the right one for getting unique values of a column from a table. Say a table EMP contains a column DEPTNO with duplicate values.

To get the unique values of DEPTNO, the query would be like this:

select a.deptno
from   emp a
where  a.rowid in (select max(rowid) from emp group by deptno);

Regards
Mudit


sribala30

  • Mar 21st, 2008
 

GROUP BY can be used to avoid duplicates.

Suppose you have a table "T" with a column "C" having the values 1, 2, 3, 4, 1, 2, 3, 4.

The following query returns unique values:

Select C From T Group By C;
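For completeness, a small runnable sketch of that example (table and column names taken from the answer above):

CREATE TABLE t (c NUMBER);

INSERT INTO t VALUES (1);
INSERT INTO t VALUES (2);
INSERT INTO t VALUES (3);
INSERT INTO t VALUES (4);
INSERT INTO t VALUES (1);
INSERT INTO t VALUES (2);
INSERT INTO t VALUES (3);
INSERT INTO t VALUES (4);

-- Returns each of 1, 2, 3, 4 exactly once
SELECT c FROM t GROUP BY c;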

ashwinee2

  • Apr 10th, 2008
 

There is a very simple solution to this.

Suppose there is a table emp containing employee ids:
EmpID
1
1
2
2
2
3
3

Now you need to get the values 1, 2 and 3.

The query should be:

SELECT EmpID
FROM   emp
GROUP BY EmpID;

This will return all distinct EmpID values from the table.


ITsforu

  • Apr 13th, 2008
 

You can use the RANK analytic function, partitioning by the columns that identify a duplicate and keeping only the first-ranked row:

select emp_id, ename
from  (select emp_id, ename,
              rank() over (partition by emp_id, ename order by rowid) emp_rank
       from   emp) t
where  emp_rank = 1
order by emp_id, ename;


SQL> select * from emp;

ENAME             ENO     SALARY
---------- ---------- ----------
abc                 1      18000
def                 2      15000
ghi                 3      22000
abc                 1      18000
def                 2      15000
ghi                 3      22000

Using the GROUP BY clause we can avoid the duplicate records instead of using the DISTINCT keyword.

SQL> select ename, eno, salary from emp group by ename, eno, salary;

ENAME             ENO     SALARY
---------- ---------- ----------
abc                 1      18000
def                 2      15000
ghi                 3      22000

SQL> select distinct * from emp;

ENAME             ENO     SALARY
---------- ---------- ----------
abc                 1      18000
ghi                 3      22000
def                 2      15000


neelapu

  • May 27th, 2010
 

If you want to see the duplicate values in a column, you can use a SELECT statement.

If you want to delete the duplicate values in a column, you can use a DELETE statement.

select deptno from emp where rowid not in (select max(rowid) from emp group by deptno);

delete from emp where rowid not in(select max(rowid) from emp group by deptno);


Use an analytic function, as below, to delete the duplicate rows.

DELETE FROM emp
WHERE ROWID IN (SELECT rid
                FROM  (SELECT ROWID rid,
                              ROW_NUMBER() OVER (PARTITION BY emp_name ORDER BY emp_id) rn
                       FROM   emp)
                WHERE rn > 1);
This will delete the duplicate rows from the emp table on the basis of duplicates in employee name.
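As a cautious check (assuming the same EMP_NAME and EMP_ID columns), the SELECT below previews exactly the rows that DELETE would remove before you run it.

-- Rows with rn > 1 are the duplicates the DELETE above targets
SELECT *
FROM  (SELECT e.*,
              ROW_NUMBER() OVER (PARTITION BY emp_name ORDER BY emp_id) rn
       FROM   emp e)
WHERE rn > 1;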


diva

  • Sep 27th, 2011
 

select dup_col from emp group by dup_col;


Tauseef

  • Oct 6th, 2011
 

SELECT FIRST_NAME
FROM   EMPLOYEES E
WHERE  ROWID = (SELECT MIN(ROWID)
                FROM   EMPLOYEES
                WHERE  FIRST_NAME = E.FIRST_NAME);

For each FIRST_NAME it checks whether the value has duplicates; by keeping only the row with the minimum ROWID, the other duplicate rows are avoided.

Excellent, it works like a charm. I was googling for more than an hour, and finally my search ended here. It would be great if a little explanation were added. Thanks again.


AMAR

  • Feb 24th, 2015
 

Use the GROUP BY clause.


Selvaraj

  • May 28th, 2015
 

SELECT *
FROM  (SELECT N.*,
              COUNT(*) OVER (PARTITION BY N.EMP_NAME) RN
       FROM   EMPLOYEE N)
WHERE RN > 1;

The inner query returns, for each row, the count of rows sharing its EMP_NAME; if that count is greater than 1, the row is a duplicate record.


Rashid

  • Jun 21st, 2015
 

Here It is:

SELECT *
FROM   tablename a
WHERE  a.ROWID > ANY (SELECT b.ROWID
                      FROM   tablename b
                      WHERE  a.fieldname = b.fieldname
                        AND  a.fieldname2 = b.fieldname2);


Malyadri M

  • Jun 22nd, 2015
 

SELECT ROWID, E.*
FROM   EMP E
WHERE  ROWID = (SELECT MAX(ROWID)
                FROM   EMP E1
                WHERE  E.ENO = E1.ENO);


Pritty

  • Feb 29th, 2016
 

The GROUP BY clause helps to retrieve the unique values without using the DISTINCT keyword.
