How to get millions of rows from PostgreSQL with multiple filters?

Hi, I'm currently having a problem reading data that runs to millions of rows from PostgreSQL. Because it would be too many queries if I wrote a separate query for every filter combination, I decided to pull the raw data from the DB and filter it with dataframes. Unfortunately, it takes a long time to finish.
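For reference, pushing the filters into the query itself, so PostgreSQL only returns the matching rows, would look roughly like this. This is just an untested sketch: it assumes a psycopg2 connection, sales_mtd is a placeholder for my real table, and the column names are the ones I filter on below.

import pandas as pd

def read_filtered(con, table, warehouses=None, brand=None, kat=None, chan=None):
    # Build the WHERE clause from only the filters that are actually set,
    # so one parameterized query replaces the per-filter queries.
    conditions, params = [], []
    if warehouses:                      # one DC, or every warehouse in a region
        conditions.append("warehouse = ANY(%s)")
        params.append(list(warehouses))
    if brand:
        conditions.append("brand = %s")
        params.append(brand)
    if kat:
        conditions.append("kat1 = %s")
        params.append(kat)
    if chan:
        conditions.append("channel = %s")
        params.append(chan)
    query = "SELECT * FROM " + table
    if conditions:
        query += " WHERE " + " AND ".join(conditions)
    return pd.read_sql(query, con, params=params)

# Example call with placeholder values:
# mtd = read_filtered(con, 'sales_mtd', warehouses=region[reg], brand=brand)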
This is what my filters look like:

[screenshot: filter dropdowns for Region, DC, Brand, Category, and Channel]

And my code looks like this:

import pandas as pd

# Read the full (unfiltered) month-to-date, year-to-date, and
# last-year-to-date result sets, then filter them in pandas below.
mtd = pd.read_sql(mtdq, con)
ytd = pd.read_sql(ytdq, con)
lytd = pd.read_sql(lytdq, con)

if reg == 'All Region':
    if dc != 'All DC':
        # A specific DC is selected: keep only that warehouse.
        mtd = mtd[mtd['warehouse'] == dc]
        ytd = ytd[ytd['warehouse'] == dc]
        lytd = lytd[lytd['warehouse'] == dc]
    
if reg != 'All Region':
    if dc == 'All DC':
        # Keep every warehouse that belongs to the selected region.
        wh = region[reg]
        mtd = mtd[mtd['warehouse'].isin(wh)]
        ytd = ytd[ytd['warehouse'].isin(wh)]
        lytd = lytd[lytd['warehouse'].isin(wh)]
    else:
        # Both a region and a specific DC are selected: the DC wins.
        mtd = mtd[mtd['warehouse'] == dc]
        ytd = ytd[ytd['warehouse'] == dc]
        lytd = lytd[lytd['warehouse'] == dc]
        
# The remaining filters apply the same way to all three frames.
if brand != 'All Brand':
    mtd = mtd[mtd['brand'] == brand]
    ytd = ytd[ytd['brand'] == brand]
    lytd = lytd[lytd['brand'] == brand]

if kat != 'All Category':
    mtd = mtd[mtd['kat1'] == kat]
    ytd = ytd[ytd['kat1'] == kat]
    lytd = lytd[lytd['kat1'] == kat]

if chan != 'All Channel':
    mtd = mtd[mtd['channel'] == chan]
    ytd = ytd[ytd['channel'] == chan]
    lytd = lytd[lytd['channel'] == chan]
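The repeated filtering can also be collected in one place and applied to all three frames in a single pass. A sketch with the same columns and variables as above (though this alone probably won't fix the runtime, since the full tables are still read first):

# Map each column to the selected values, skipping the "All ..." defaults.
active = {}
if dc != 'All DC':
    active['warehouse'] = [dc]
elif reg != 'All Region':
    active['warehouse'] = region[reg]   # every warehouse in the region
if brand != 'All Brand':
    active['brand'] = [brand]
if kat != 'All Category':
    active['kat1'] = [kat]
if chan != 'All Channel':
    active['channel'] = [chan]

for col, values in active.items():
    mtd = mtd[mtd[col].isin(values)]
    ytd = ytd[ytd[col].isin(values)]
    lytd = lytd[lytd[col].isin(values)]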

I still don't know what the best way is. Please help me, thank you!