| | | |
|---|---|---|
| author | Linus Walleij <linus.walleij@linaro.org> | 2019-08-04 02:38:52 +0200 |
| committer | Mark Brown <broonie@kernel.org> | 2019-08-28 14:11:01 +0100 |
| commit | 3bd158c56a56e8767e569d7fbc66efbedc478077 (patch) | |
| tree | 0ba5fa694e82a0f5368d1cede6a46b3913f756fe /drivers/spi/spi-omap2-mcspi.c | |
| parent | 0f0581b24bd019dfe32878e4c1bde266c7364e02 (diff) | |
spi: bcm2835: Convert to use CS GPIO descriptors
This converts the BCM2835 SPI master driver to use GPIO
descriptors for chip select handling.
The BCM2835 driver was already relying on the core to drive the
CS high/low, so only very small changes were needed for this
part. When the CS can be requested from the device tree node,
everything is pretty straightforward.
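As an illustration, the sketch below shows the general shape of a
descriptor-based conversion, not the literal hunk from this patch:
the probe function name and the simplified error handling are
assumptions, and only the use_gpio_descriptors opt-in is the point.

```c
/*
 * Minimal sketch (not the literal patch) of a descriptor-based CS
 * conversion: the controller opts in and the SPI core then requests
 * the "cs-gpios" lines from the device tree as GPIO descriptors and
 * toggles them around each transfer.
 */
#include <linux/platform_device.h>
#include <linux/spi/spi.h>

static int example_spi_probe(struct platform_device *pdev)
{
	struct spi_controller *ctlr;

	ctlr = spi_alloc_master(&pdev->dev, 0);
	if (!ctlr)
		return -ENOMEM;

	/* Let the SPI core request and drive the CS lines for us */
	ctlr->use_gpio_descriptors = true;
	ctlr->dev.of_node = pdev->dev.of_node;

	return devm_spi_register_controller(&pdev->dev, ctlr);
}
```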
However, for native chip selects this driver has a quite
unorthodox loopback: it requests GPIOs from the SoC GPIO chip
by looking that chip up with gpiochip_find() and then
offsetting hard into its number space. This has been
augmented a bit by using gpiochip_request_own_desc(),
but this code really needs to be verified. If a "native CS"
is actually an SoC GPIO, why is it even done this way?
Should this GPIO not just be defined in the device tree
like any other CS GPIO? I'm confused.
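For reference, the loopback pattern being questioned looks roughly
like the sketch below, assuming the v5.3-era gpiolib signatures of
gpiochip_find() and gpiochip_request_own_desc(); the function names,
label string and the hard-coded offset are illustrative of the
approach, not copied verbatim from the driver.

```c
/*
 * Rough sketch of the "native CS as SoC GPIO" loopback, assuming the
 * v5.3-era gpiolib API; function names and the exact offset are
 * illustrative, not copied verbatim from the driver.
 */
#include <linux/err.h>
#include <linux/gpio/consumer.h>
#include <linux/gpio/driver.h>
#include <linux/gpio/machine.h>
#include <linux/spi/spi.h>
#include <linux/string.h>

/* Match the SoC pin controller's gpio_chip by its label */
static int chip_match_name(struct gpio_chip *chip, void *data)
{
	return !strcmp(chip->label, data);
}

static int example_spi_setup(struct spi_device *spi)
{
	struct gpio_chip *chip;

	/* Look up the SoC GPIO chip instead of using a "cs-gpios" entry */
	chip = gpiochip_find("pinctrl-bcm2835", chip_match_name);
	if (!chip)
		return 0;

	/*
	 * Offset hard into the chip's number space (the native CS lines
	 * sit on fixed low GPIO numbers on this SoC) and claim the line
	 * on behalf of the driver.
	 */
	spi->cs_gpiod = gpiochip_request_own_desc(chip,
						  8 - spi->chip_select,
						  "spi-cs",
						  GPIO_LOOKUP_FLAGS_DEFAULT,
						  GPIOD_OUT_LOW);
	if (IS_ERR(spi->cs_gpiod))
		return PTR_ERR(spi->cs_gpiod);

	return 0;
}
```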
Cc: Lukas Wunner <lukas@wunner.de>
Cc: Stefan Wahren <stefan.wahren@i2se.com>
Cc: Martin Sperl <kernel@martin.sperl.org>
Cc: Chris Boot <bootc@bootc.net>
Signed-off-by: Linus Walleij <linus.walleij@linaro.org>
Link: https://lore.kernel.org/r/20190804003852.1312-1-linus.walleij@linaro.org
Signed-off-by: Mark Brown <broonie@kernel.org>