mmc: slot-gpio: Fix debounce time to use milliseconds again
author Marek Szyprowski <m.szyprowski@samsung.com>
Fri, 28 Sep 2018 12:20:40 +0000 (14:20 +0200)
committer Greg Kroah-Hartman <gregkh@linuxfoundation.org>
Sat, 13 Oct 2018 07:33:11 +0000 (09:33 +0200)
commit 1b09d9c232cdaea59fb50ac437d3921ed1f1eafb upstream.

The debounce value passed to the mmc_gpiod_request_cd() function is in
microseconds, but msecs_to_jiffies() requires the value to be in
milliseconds to properly calculate the delay, so adjust the value stored
in the cd_debounce_delay_ms context entry.

Fixes: 1d71926bbd59 ("mmc: core: Fix debounce time to use microseconds")
Fixes: bfd694d5e21c ("mmc: core: Add tunable delay before detecting card after card is inserted")
Cc: stable@vger.kernel.org # v4.18+
Signed-off-by: Marek Szyprowski <m.szyprowski@samsung.com>
Reviewed-by: Linus Walleij <linus.walleij@linaro.org>
Signed-off-by: Ulf Hansson <ulf.hansson@linaro.org>
Signed-off-by: Greg Kroah-Hartman <gregkh@linuxfoundation.org>
drivers/mmc/core/slot-gpio.c

index 2a83368..86803a3 100644
@@ -271,7 +271,7 @@ int mmc_gpiod_request_cd(struct mmc_host *host, const char *con_id,
        if (debounce) {
                ret = gpiod_set_debounce(desc, debounce);
                if (ret < 0)
-                       ctx->cd_debounce_delay_ms = debounce;
+                       ctx->cd_debounce_delay_ms = debounce / 1000;
        }
 
        if (gpio_invert)
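
The conversion in the hunk above can be exercised in isolation. The sketch below is a standalone user-space illustration, not the kernel code: fake_gpiod_set_debounce(), struct cd_ctx and the hard-coded values are hypothetical stand-ins for gpiod_set_debounce() and the slot-gpio context. It only shows why the caller's microsecond value has to be divided by 1000 before it is stored in a millisecond field that a helper such as msecs_to_jiffies() later consumes.

#include <stdio.h>

struct cd_ctx {
	unsigned long cd_debounce_delay_ms;	/* software fallback delay, in milliseconds */
};

/* Hypothetical stand-in for gpiod_set_debounce(): simulate a GPIO
 * controller that cannot debounce in hardware. */
static int fake_gpiod_set_debounce(unsigned int debounce_us)
{
	(void)debounce_us;
	return -1;
}

int main(void)
{
	struct cd_ctx ctx = { .cd_debounce_delay_ms = 200 };
	unsigned int debounce = 5000;	/* caller-supplied debounce, in microseconds */

	/* Mirrors the fixed hunk: fall back to the software delay only when
	 * hardware debouncing is unavailable, converting us -> ms before
	 * storing into the millisecond field. */
	if (debounce) {
		if (fake_gpiod_set_debounce(debounce) < 0)
			ctx.cd_debounce_delay_ms = debounce / 1000;
	}

	/* In the kernel this value would later feed a millisecond-based
	 * helper such as msecs_to_jiffies(); here: 5000 us -> 5 ms. */
	printf("software debounce fallback: %lu ms\n", ctx.cd_debounce_delay_ms);
	return 0;
}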